vikkyshostak (@vikkyshostak)

VDS Debian 9 + Redis. Why aren't Celery tasks executing in Django 2.x?

Good day to you all.

My setup: a VDS running Debian 9.3 (1 core, 512 MB RAM), the stock Python 3.5.3 and Redis 3.2.6 (as shipped with Debian). The project is on Django 2.0.3 and uses the Celery 4.1.0 task queue.

Redis is up and running:

$ redis-server --version
Redis server v=3.2.6 sha=00000000:0 malloc=jemalloc-3.6.0 bits=64 build=826601c992442478

$ redis-cli ping
PONG

The Celery settings in Django are completely standard (straight from the docs):

######
# ./myproject/celery.py
######

# Import __future__
from __future__ import absolute_import, unicode_literals
# Import Python packages
import os
# Import Celery
from celery import Celery

# Set the default Django settings module for the 'celery' program
os.environ.setdefault(
    'DJANGO_SETTINGS_MODULE', 'myproject.settings.base'
    # where settings is simply a package with three config modules:
    # dev.py and prod.py for the different environments,
    # base.py for shared settings (the one with INSTALLED_APPS)
)

# Init Celery
app = Celery('myproject')

# Using a `CELERY_` prefix
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs
app.autodiscover_tasks()


######
# ./myproject/__init__.py
######

# Import __future__
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ['celery_app']

The Celery config in base.py looks like this:

######
# ./myproject/settings/base.py
######

...
BROKER_URL = 'redis://127.0.0.1:6379/0'  # Redis
BROKER_TRANSPORT_OPTIONS = {'visibility_timeout': 3600}

CELERY_BROKER_URL = BROKER_URL
CELERY_RESULT_BACKEND = BROKER_URL
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE
CELERY_IGNORE_RESULT = True
...
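
One thing worth noting about the listing above: since celery.py calls config_from_object(..., namespace='CELERY'), Celery only reads settings that carry the CELERY_ prefix, so an unprefixed name like BROKER_TRANSPORT_OPTIONS is silently ignored. A minimal sketch of that filtering behavior (illustrative only, not Celery's actual implementation):

```python
# Illustrative sketch of how a CELERY_ namespace filter works:
# only keys with the prefix are picked up, and the prefix is stripped.
settings = {
    'BROKER_TRANSPORT_OPTIONS': {'visibility_timeout': 3600},  # no prefix -> ignored
    'CELERY_BROKER_URL': 'redis://127.0.0.1:6379/0',
    'CELERY_TASK_SERIALIZER': 'json',
}

def with_namespace(d, prefix='CELERY_'):
    # Keep only prefixed keys and lowercase them, the way the
    # namespace option maps CELERY_BROKER_URL -> broker_url.
    return {k[len(prefix):].lower(): v for k, v in d.items() if k.startswith(prefix)}

print(with_namespace(settings))
# -> {'broker_url': 'redis://127.0.0.1:6379/0', 'task_serializer': 'json'}
```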

The test task looks like this:

######
# ./app/tasks.py
######

# Import __future__
from __future__ import absolute_import, unicode_literals
# Import Celery
from celery import shared_task
# Import the Django helpers used below
from django.core.mail import EmailMessage
from django.template.loader import render_to_string
from django.utils.translation import gettext as _

@shared_task
def send_welcome_mail_new_user(user_email):

    # Set defaults
    from_email = 'test@myproject.ru'
    subject = _('Welcome!')
    html_content = render_to_string('mails/mails_welcome_mail_new_user.html', {
        'subject': subject,
    })

    # Make `msg` and send
    msg = EmailMessage(subject, html_content, from_email, [user_email])
    msg.content_subtype = 'html'
    msg.send()

I call it the usual way, via send_welcome_mail_new_user.delay('mail@user.ru'). For example, from the homepage view on every visit, purely for testing.
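
Since the settings use the JSON serializer, .delay('mail@user.ru') boils down to pushing a JSON payload onto the broker's queue. A rough sketch of that round trip (the field names below only loosely mirror Celery's message protocol; this is not the exact wire format):

```python
import json
import uuid

# Simplified sketch of what .delay() enqueues with the JSON serializer.
# The real Celery protocol carries more headers; "task", "id" and "args"
# are named here loosely for illustration.
message = {
    'task': 'app.tasks.send_welcome_mail_new_user',
    'id': str(uuid.uuid4()),
    'args': ['mail@user.ru'],
    'kwargs': {},
}
encoded = json.dumps(message)   # arguments must be JSON-serializable
decoded = json.loads(encoded)
print(decoded['args'][0])       # -> mail@user.ru
```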

This task also needs extra settings for sending mail:

######
# ./myproject/settings/base.py
######

EMAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend'
EMAIL_HOST = 'smtp.mail.ru'
EMAIL_HOST_PASSWORD = 'password'
EMAIL_HOST_USER = 'test@myproject.ru'
EMAIL_USE_SSL = True
EMAIL_PORT = 465

The Celery daemon on the VDS was set up entirely by this guide: https://pythad.github.io/articles/2016-12/how-to-r... (which is really just a compilation of StackOverflow answers).

The configs:

######
# /etc/default/celeryd
######

# Absolute or relative path to the 'celery' command:
CELERY_BIN="/usr/local/bin/celery"

# App instance to use
CELERY_APP="myproject"

# Where to chdir at start.
CELERYD_CHDIR="/var/www/html/myproject"

# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=300 --concurrency=8"

# %n will be replaced with the first part of the nodename.
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"

# Workers should run as an unprivileged user.
#   You need to create this user manually (or you can choose
#   a user/group combination that already exists (e.g., nobody).
CELERYD_USER="celery"
CELERYD_GROUP="celery"

# If enabled pid and log directories will be created if missing,
# and owned by the userid/group configured.
CELERY_CREATE_DIRS=1

export SECRET_KEY="secret"


######
# /etc/default/celerybeat
######

# Absolute or relative path to the 'celery' command:
CELERY_BIN="/usr/local/bin/celery"

# App instance to use
# comment out this line if you don't use an app
CELERY_APP="myproject"

# Where to chdir at start.
CELERYBEAT_CHDIR="/var/www/html/myproject"

# Extra arguments to celerybeat
CELERYBEAT_OPTS="--schedule=/var/run/celery/celerybeat-schedule"

Everything is started and (seemingly) works:

$ sudo /etc/init.d/celeryd start

celery init v10.1.
Using config script: /etc/default/celeryd
celery multi v4.1.0 (latentcall)
> Starting nodes...
	> celery@cs202414: OK

$ sudo /etc/init.d/celeryd status

celery init v10.1.
Using config script: /etc/default/celeryd
celeryd (node celery) (pid 22335) is up...

$ sudo /etc/init.d/celerybeat status

celery init v10.1.
Using configuration: /etc/default/celeryd, /etc/default/celerybeat
celerybeat (pid 20183) is up...

Next I trigger the task (just open the homepage)... and nothing happens: no email is sent. I open the Celery log at /var/log/celery/celery.log, and it is full of warnings like this:

...
During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/celery/worker/worker.py", line 203, in start
    self.blueprint.start(self)
  File "/usr/local/lib/python3.5/dist-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/usr/local/lib/python3.5/dist-packages/celery/bootsteps.py", line 370, in start
    return self.obj.start()
  File "/usr/local/lib/python3.5/dist-packages/celery/worker/consumer/consumer.py", line 320, in start
    blueprint.start(self)
  File "/usr/local/lib/python3.5/dist-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/usr/local/lib/python3.5/dist-packages/celery/worker/consumer/consumer.py", line 596, in start
    c.loop(*c.loop_args())
  File "/usr/local/lib/python3.5/dist-packages/celery/worker/loops.py", line 88, in asynloop
    next(loop)
  File "/usr/local/lib/python3.5/dist-packages/kombu/async/hub.py", line 293, in create_loop
    poll_timeout = fire_timers(propagate=propagate) if scheduled else 1
  File "/usr/local/lib/python3.5/dist-packages/kombu/async/hub.py", line 136, in fire_timers
    entry()
  File "/usr/local/lib/python3.5/dist-packages/kombu/async/timer.py", line 68, in __call__
    return self.fun(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.5/dist-packages/kombu/async/timer.py", line 127, in _reschedules
    return fun(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/billiard/pool.py", line 1316, in maintain_pool
    sys.exc_info()[2])
  File "/usr/local/lib/python3.5/dist-packages/billiard/five.py", line 123, in reraise
    raise value.with_traceback(tb)
  File "/usr/local/lib/python3.5/dist-packages/billiard/pool.py", line 1307, in maintain_pool
    self._maintain_pool()
  File "/usr/local/lib/python3.5/dist-packages/billiard/pool.py", line 1299, in _maintain_pool
    self._repopulate_pool(joined)
  File "/usr/local/lib/python3.5/dist-packages/billiard/pool.py", line 1284, in _repopulate_pool
    self._create_worker_process(self._avail_index())
  File "/usr/local/lib/python3.5/dist-packages/celery/concurrency/asynpool.py", line 439, in _create_worker_process
    return super(AsynPool, self)._create_worker_process(i)
  File "/usr/local/lib/python3.5/dist-packages/billiard/pool.py", line 1116, in _create_worker_process
    w.start()
  File "/usr/local/lib/python3.5/dist-packages/billiard/process.py", line 124, in start
    self._popen = self._Popen(self)
  File "/usr/local/lib/python3.5/dist-packages/billiard/context.py", line 333, in _Popen
    return Popen(process_obj)
  File "/usr/local/lib/python3.5/dist-packages/billiard/popen_fork.py", line 24, in __init__
    self._launch(process_obj)
  File "/usr/local/lib/python3.5/dist-packages/billiard/popen_fork.py", line 72, in _launch
    self.pid = os.fork()
MemoryError: [Errno 12] Cannot allocate memory
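
For context, the traceback ends in MemoryError raised by os.fork(): the prefork pool keeps trying to spawn the 8 child processes requested by --concurrency=8, and on a 512 MB box each fork fails. A back-of-envelope estimate (the per-worker figure below is an assumption for illustration, not a measurement):

```python
# Rough estimate of the prefork pool's footprint with --concurrency=8.
# per_worker_mb is an assumed resident size for one Django-loaded worker.
ram_mb = 512
per_worker_mb = 60
workers = 8
needed_mb = workers * per_worker_mb
print(needed_mb)  # -> 480, nearly the whole box before the parent, Redis and the OS
```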

Running the project locally (macOS 10.13.3) with celery -A myproject worker -l info and triggering the task the same way, the email lands in the console (my dev settings use the console email backend).

I'd really appreciate sound advice and/or real-world experience with setting these things up in production; I've been fighting this for a whole day now :( If any other configs/logs would help, I'll gladly post them, just ask in the comments.

Thanks in advance!
  • Question asked
  • 297 views