
[Feature] Concurrent job runners #2

Closed
guewen opened this issue Nov 2, 2016 · 7 comments
Labels: enhancement, stale

Comments

guewen (Member) commented Nov 2, 2016

In a use case with several hosts running Odoo, we might want either to:

  • allow several jobrunners to run concurrently on several hosts, which might be hard because the queues are kept in local memory
  • elect a master jobrunner that is allowed to work, while the others wait and one of them takes over if the master goes down

lasley (Contributor) commented Nov 4, 2016

IMO the best way to implement this would be a message broker, such as RabbitMQ.

We would just need to publish to and consume from a simple work queue. In the event of the master going down, an Alternate Exchange would do nicely, I think.
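
A minimal sketch of that work queue using pika, for illustration (the queue name, host, and run_job() handler are assumptions, and the alternate-exchange failover is not shown):

```python
# Hedged sketch of a RabbitMQ work queue shared by several jobrunners.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# Durable queue so enqueued jobs survive a broker restart.
channel.queue_declare(queue="queue_job_work", durable=True)

def enqueue(job_uuid):
    # Persistent delivery: the broker writes the message to disk.
    channel.basic_publish(
        exchange="",
        routing_key="queue_job_work",
        body=job_uuid,
        properties=pika.BasicProperties(delivery_mode=2),
    )

def run_job(job_uuid):
    print("running job %s" % job_uuid)  # placeholder for real job execution

def on_message(ch, method, properties, body):
    run_job(body.decode())
    ch.basic_ack(method.delivery_tag)  # ack only after the job succeeded

# Fair dispatch: each runner holds at most one unacknowledged job, so
# concurrent runners on several hosts share the load automatically.
channel.basic_qos(prefetch_count=1)
channel.basic_consume(queue="queue_job_work", on_message_callback=on_message)
channel.start_consuming()
```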

adrienpeiffer pushed a commit to acsone/queue that referenced this issue Dec 7, 2016: "Some proposal after functional testing"

yucer commented Mar 16, 2017

Odoo optionally uses gevent for the bus, so the dependency is already there.

Maybe gevent.queue could be used for that purpose.
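
For illustration, a small sketch of gevent.queue.Queue feeding a few greenlets (worker names and payloads are made up; this only shares work inside one process, so it would not by itself coordinate runners across hosts):

```python
# Hedged sketch: gevent.queue as an in-process work queue.
import gevent
from gevent.queue import Queue, Empty

jobs = Queue()

def worker(name):
    while True:
        try:
            job = jobs.get(timeout=1)  # block up to 1s waiting for work
        except Empty:
            return  # queue drained, greenlet exits
        print("%s runs %s" % (name, job))
        gevent.sleep(0)  # yield so other greenlets can run

for i in range(10):
    jobs.put("job-%d" % i)

gevent.joinall([gevent.spawn(worker, "w%d" % n) for n in range(3)])
```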

@thomaspaulb

> elect a master jobrunner that is allowed to work, while the others wait and one of them takes over if the master goes down

Could this be done via a database lock? E.g. each Odoo instance regularly tries to grab a database lock in order to become the master job runner; if it succeeds, it becomes the master; if not, it does nothing and tries again after a while.
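
A hedged sketch of that idea using a PostgreSQL session-level advisory lock (the lock key, DSN handling, and run_jobrunner() are illustrative, not part of this module):

```python
# Leader election via a PostgreSQL advisory lock: the session holding
# the lock is the master; if its connection dies, PostgreSQL releases
# the lock automatically and another host can take over.
import time
import psycopg2

JOBRUNNER_LOCK_KEY = 866299  # arbitrary application-chosen bigint

def run_jobrunner():
    print("acting as master jobrunner")  # placeholder for the real loop

def run_as_master(dsn):
    conn = psycopg2.connect(dsn)
    conn.autocommit = True
    cr = conn.cursor()
    while True:
        cr.execute("SELECT pg_try_advisory_lock(%s)", (JOBRUNNER_LOCK_KEY,))
        if cr.fetchone()[0]:
            run_jobrunner()
            return
        time.sleep(5)  # not master: wait and try again
```

A session-level advisory lock fits well here because it is released automatically when the holder's connection dies, which gives the take-over behavior for free.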

nilshamerlinck (Contributor) commented Sep 7, 2021

Hi @thomaspaulb, you might want to have a look at #256 :)

@thomaspaulb

@nilshamerlinck I actually just found that one too haha, thanks.

github-actions bot commented Mar 6, 2022

There hasn't been any activity on this issue in the past 6 months, so it has been marked as stale and it will be closed automatically if no further activity occurs in the next 30 days.
If you want this issue to never become stale, please ask a PSC member to apply the "no stale" label.

github-actions bot added the "stale" label Mar 6, 2022
aaltinisik pushed a commit to aaltinisik/queue that referenced this issue Dec 8, 2023