[GH-ISSUE #531] Can asynq add a scheduled task dynamically using a REST API trigger? #250

Open
opened 2026-03-02 05:19:56 +03:00 by kerem · 3 comments

Originally created by @daicheng123 on GitHub (Aug 30, 2022).
Original GitHub issue: https://github.com/hibiken/asynq/issues/531

Originally assigned to: @hibiken on GitHub.

Just like Python Celery. Thanks!


@hibiken commented on GitHub (Aug 31, 2022):

I'm not familiar with Celery's API you are referring to. Would you mind providing a link or adding a detailed explanation?


@daicheng123 commented on GitHub (Aug 31, 2022):

import json

from django.db import transaction
from django.utils.decorators import method_decorator
from django_celery_beat.models import CrontabSchedule, PeriodicTask

@transaction.atomic()
@method_decorator(advance_logger('create a cron task by user'))
def post(self, request):
    cron_info = request.data['cron_form']
    work_id = request.data['work_id']
    work_object = Cluster_workinfo.objects.get(id=work_id)
    task_info = {
        'cluster_id': work_object.cluster_id,
        'work_id': work_object.id,
        'work_name': work_object.work_name,
    }
    task_type = 1
    s_id = transaction.savepoint()
    try:
        # django-celery-beat: reuse an existing crontab entry or create one
        schedule, _ = CrontabSchedule.objects.get_or_create(
            minute=cron_info['min'],
            hour=cron_info['hour'],
            day_of_month=cron_info['day'],
            month_of_year=cron_info['mon'],
            day_of_week=cron_info['week'],
            timezone='Asia/Shanghai',
        )
        # Register the periodic task dynamically; celery beat picks it up
        pt_object = PeriodicTask.objects.create(
            crontab=schedule,
            name=work_object.work_name,
            task='sysintegration.tasks.cluster_task',
            args=json.dumps([task_info, task_type]),
            enabled=cron_info['cronRun'],
        )
    except Exception:
        transaction.savepoint_rollback(s_id)
        raise

just like this
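For comparison, asynq's Scheduler can register cron entries at runtime via `Scheduler.Register(cronspec, task)` and remove them with `Scheduler.Unregister(entryID)`. A minimal sketch of building the five-field cronspec string from a form like the one above (the `cronSpec` helper is ours, not part of asynq; the asynq calls themselves are shown only in comments so the sketch has no external dependencies):

```go
package main

import (
	"fmt"
	"strings"
)

// cronSpec builds the standard five-field cron expression that
// asynq's Scheduler.Register expects, from the same fields the
// Django CrontabSchedule above takes.
func cronSpec(min, hour, dom, mon, dow string) string {
	return strings.Join([]string{min, hour, dom, mon, dow}, " ")
}

func main() {
	// e.g. run at 02:30 every Monday
	spec := cronSpec("30", "2", "*", "*", "1")
	fmt.Println(spec) // 30 2 * * 1

	// With github.com/hibiken/asynq (not imported here), the entry
	// could then be added and later removed at runtime:
	//   entryID, err := scheduler.Register(spec, asynq.NewTask("cluster:task", payload))
	//   ...
	//   err = scheduler.Unregister(entryID)
}
```

Unlike django-celery-beat, the entries live in the running Scheduler process rather than a database, so a service that accepts them over HTTP would also need to persist them itself if they must survive restarts.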


@kamikazechaser commented on GitHub (Oct 6, 2022):

Yes it can. You will have to write your own HTTP interface and queue the periodic job using the built-in Go client, or use something like https://github.com/newlife/asynq-py. The worker/processor will still have to be written in Go.

Note: AFAIK, this isn't a Celery-like protocol that lets you write both clients and processors in any language. See #105
