There are two ways to run multiple Scrapy spiders sequentially:
第一种:bat方式运行
新建bat文件
cd C:\python_web\spiders\tiktokSelenium & C: & scrapy crawl spider1 & scrapy crawl spider2 & scrapy crawl spider3 & scrapy crawl spider4
Method 2: use subprocess to run the spiders in order
Create a start.py file with the following content:
import subprocess

def crawl_work():
    # .wait() blocks until each spider's process exits,
    # so the spiders run one after another.
    subprocess.Popen('scrapy crawl spider1', shell=True).wait()
    subprocess.Popen('scrapy crawl spider2', shell=True).wait()
    subprocess.Popen('scrapy crawl spider3', shell=True).wait()
    subprocess.Popen('scrapy crawl spider4', shell=True).wait()

if __name__ == '__main__':
    crawl_work()
Then run it:
cd C:\python_web\spiders\tiktokSelenium & C: & python ./start.py
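The .wait() call in Method 2 is what makes the runs sequential: each Popen blocks until the spawned process exits and returns its exit code. A minimal sketch of the same pattern, with python -c commands standing in for the scrapy crawl calls (an assumption, so the sketch runs without a Scrapy project):

```python
import subprocess
import sys

def crawl_work():
    # Stand-ins for `scrapy crawl spiderN`; each command starts only
    # after the previous one has exited, because .wait() blocks.
    commands = [
        [sys.executable, "-c", "print('spider1 done')"],
        [sys.executable, "-c", "print('spider2 done')"],
    ]
    codes = []
    for cmd in commands:
        codes.append(subprocess.Popen(cmd).wait())
    return codes  # one exit code per command, 0 on success

if __name__ == "__main__":
    print(crawl_work())
```

Collecting the exit codes also lets you abort the loop (or log a warning) as soon as one spider fails, which the plain chain of .wait() calls above does not do.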