As far as I can tell, the log looks fine:
2016-04-19 17:12:36 [scrapy] INFO: Scrapy 1.0.1 started (bot: intelxeon)
2016-04-19 17:12:36 [scrapy] INFO: Optional features available: ssl, http11
2016-04-19 17:12:36 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'intelxeon.spiders', 'FEED_URI': 'xeon.csv', 'SPIDER_MODULES': ['intelxeon.spiders'], 'BOT_NAME': 'intelxeon', 'LOG_STDOUT': True, 'FEED_FORMAT': 'csv', 'LOG_FILE': 'C:/log.txt'}
2016-04-19 17:12:36 [scrapy] INFO: Enabled extensions: CloseSpider, FeedExporter, TelnetConsole, LogStats, CoreStats, SpiderState
2016-04-19 17:12:37 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, ChunkedTransferMiddleware, DownloaderStats
2016-04-19 17:12:37 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2016-04-19 17:12:37 [scrapy] INFO: Enabled item pipelines:
2016-04-19 17:12:37 [scrapy] INFO: Spider opened
2016-04-19 17:12:37 [scrapy] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2016-04-19 17:12:37 [scrapy] DEBUG: Telnet console listening on 127.0.0.1:6023
2016-04-19 17:12:38 [scrapy] DEBUG: Crawled (200) <GET http://ark.intel.com/products/family/78581> (referer: None)
2016-04-19 17:12:39 [scrapy] DEBUG: Crawled (200) <GET http://ark.intel.com/products/family/78581> (referer: http://ark.intel.com/products/family/78581)
...
2016-04-19 17:12:40 [scrapy] INFO: Closing spider (finished)
2016-04-19 17:12:40 [scrapy] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 16136,
'downloader/request_count': 51,
'downloader/request_method_count/GET': 51,
'downloader/response_bytes': 697437,
'downloader/response_count': 51,
'downloader/response_status_count/200': 51,
'dupefilter/filtered': 8,
'finish_reason': 'finished',
'finish_time': datetime.datetime(2016, 4, 19, 13, 12, 40, 99000),
'log_count/DEBUG': 53,
'log_count/INFO': 7,
'request_depth_max': 1,
'response_received_count': 51,
'scheduler/dequeued': 51,
'scheduler/dequeued/memory': 51,
'scheduler/enqueued': 51,
'scheduler/enqueued/memory': 51,
If I specify the output on the command line as -o c:\xeon.csv , I get an error like this:
2016-04-19 17:14:33 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'intelxeon.spiders', 'FEED_URI': 'c:\xeon.csv', 'SPIDER_MODULES': ['intelxeon.spiders'], 'BOT_NAME': 'intelxeon', 'LOG_STDOUT': True, 'FEED_FORMAT': 'csv', 'LOG_FILE': 'C:/log.txt'}
2016-04-19 17:14:33 [scrapy] ERROR: Unknown feed storage scheme: c
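A likely cause (my assumption, not confirmed in the log): Scrapy parses FEED_URI as a URI, so the Windows drive prefix `c:` is taken for a URI scheme, and there is no feed storage backend registered for the scheme `c`. This can be reproduced with the standard library's URL parser:

```python
from urllib.parse import urlparse

# A Windows path with a drive letter is misread as a URI
# whose scheme is the single letter before the colon.
print(urlparse(r'c:\xeon.csv').scheme)       # -> c

# An explicit file:// URI parses with the expected scheme.
print(urlparse('file:///C:/xeon.csv').scheme)  # -> file
```

So passing a relative path (`-o xeon.csv`) or an explicit file URI (`-o file:///C:/xeon.csv`) should avoid the "Unknown feed storage scheme" error.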
As for the fields: I removed everything for the sake of a clean experiment and left only
item['url'] = response.request.url
That should get written out in any case, shouldn't it?