Has anyone had trouble with the crawler simply stopping and hanging after several thousand files? I’m wondering whether specific files are crashing the crawler or whether I’m running out of memory.
Is there a logfile or anything similar to look at? I’ve been watching the crawl in the console, and it seems to halt at around 4,600 files.
Ideas for troubleshooting would be appreciated.