=======  ================================
SEP      11
Title    Process models for Scrapy
Created  2009-11-16
Status   Partially implemented - see #168
=======  ================================
There is interest in supporting different process models for Scrapy, mainly to help prevent the memory leaks that affect running all spiders in the same process.
By running each spider in a separate process (or pool of processes), we would be able to "recycle" processes when they exceed a maximum amount of memory.
The user could choose between different process models:

- all spiders in a single process (the current behaviour)
- one process per spider
- a pool of processes shared among spiders
Using different processes would increase reliability at the cost of performance.
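The recycling idea above can be sketched with the standard library: run the spider's work in a child process, report its peak memory usage back to the parent, and let the OS reclaim all memory when the child exits. This is a minimal illustration, not Scrapy's implementation; `run_spider`, `crawl_in_subprocess`, and the memory threshold are hypothetical names introduced here.

```python
import multiprocessing
import resource

# Hypothetical recycling threshold; a real setting would be configurable.
MEMORY_LIMIT_KB = 512 * 1024


def run_spider(spider_name, queue):
    """Placeholder for one spider's crawl (a real implementation would
    start a Scrapy crawl here). Reports its own peak memory usage."""
    # ru_maxrss is the process's peak resident set size
    # (kilobytes on Linux, bytes on macOS).
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    queue.put((spider_name, peak))


def crawl_in_subprocess(spider_name):
    """Run a spider in its own process, so all of its memory is
    reclaimed by the OS when the process exits ("recycling")."""
    queue = multiprocessing.Queue()
    proc = multiprocessing.Process(target=run_spider,
                                   args=(spider_name, queue))
    proc.start()
    proc.join()
    name, peak = queue.get()
    if peak > MEMORY_LIMIT_KB:
        # In a pool model, the parent would start a fresh process here
        # instead of reusing this one.
        pass
    return name, peak


if __name__ == "__main__":
    print(crawl_in_subprocess("example_spider"))
```

The same structure extends naturally to the pool model: a fixed set of worker processes, each replaced by a fresh one whenever its reported memory exceeds the threshold.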