I have a workspace containing both a Django project and a Scrapy project:
my_project/
    django_project/
        django_project/
            settings.py
        app1/
        app2/
        manage.py
        ...
    scrapy_project/
        scrapy_project/
            settings.py
        scrapy.cfg
        ...
I have already connected Scrapy to my Django app1 model, so every time I run my spider, the collected data is stored in my PostgreSQL database. Here is how my Scrapy project accesses the Django model:
import sys
import os
import django

# Make the Django project importable from the Scrapy side
# (path is relative to this file; adjust it to your layout)
sys.path.append(os.path.join(os.path.dirname(__file__), '..', '..', 'django_project'))
os.environ['DJANGO_SETTINGS_MODULE'] = 'django_project.settings'
django.setup()
Everything works fine when I run the spider from the command line, but when I try to run it as a script from a Django view or a Celery task, for example:
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings
process = CrawlerProcess(get_project_settings())
process.crawl('spider_name')
process.start()
I get this error:
KeyError: 'Spider not found: spider_name'
I think I need to tell Django where the Scrapy project lives (just as I told Scrapy where Django lives, in the Scrapy settings), but I don't know how to do that. Honestly, I'm not even sure my folder structure is well designed for this project.
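Here is a rough sketch of what I suspect the Django side needs: a mirror image of the setup I did on the Scrapy side. The path and the use of the SCRAPY_SETTINGS_MODULE environment variable are my guesses, not something I have working:

```python
import os
import sys

# Hypothetical path: where the Scrapy project lives relative to the
# Django process's working directory -- adjust to the real layout.
SCRAPY_PROJECT_DIR = os.path.abspath('../scrapy_project')

# Make the Scrapy project importable and tell get_project_settings()
# which settings module to load, mirroring what I did on the Scrapy side.
if SCRAPY_PROJECT_DIR not in sys.path:
    sys.path.append(SCRAPY_PROJECT_DIR)
os.environ.setdefault('SCRAPY_SETTINGS_MODULE', 'scrapy_project.settings')

# With the environment prepared, I would expect the snippet from above
# to resolve 'spider_name' via the project's SPIDER_MODULES setting:
# from scrapy.crawler import CrawlerProcess
# from scrapy.utils.project import get_project_settings
# process = CrawlerProcess(get_project_settings())
# process.crawl('spider_name')
# process.start()
```

Is this the right approach, or should I restructure the projects instead?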