I've had Google request over two thousand pages in under three minutes before, sometimes with 30 Googlebots grabbing stuff at once. Googlebot is a very unfriendly spider for sites with lots of pages but limited hardware. Also, despite its ability to grab URLs with query vars, it seems to lose interest in a site before grabbing everything. I wonder if it limits depth, maybe using the number of query vars as a substitute for depth.
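Just to make the guess concrete: a crawler could treat each query variable as an extra level of depth on top of the path. Here's a minimal sketch of that heuristic (the `effective_depth` function and the cutoff value are entirely made up for illustration, not anything Googlebot is documented to do):

```python
from urllib.parse import urlparse, parse_qs

def effective_depth(url):
    """Estimate crawl 'depth' as path segments plus query variables.

    Hypothetical heuristic: counts query vars as extra depth.
    This is a guess at crawler behavior, not Googlebot's actual logic.
    """
    parsed = urlparse(url)
    path_depth = len([seg for seg in parsed.path.split("/") if seg])
    query_depth = len(parse_qs(parsed.query))
    return path_depth + query_depth

MAX_DEPTH = 5  # made-up cutoff for illustration

url = "http://example.com/forum/view.php?board=3&topic=42&page=2"
# 2 path segments + 3 query vars = depth 5, right at the cutoff
print(effective_depth(url))
```

Under a scheme like this, deep forum pages with several query vars would get skipped even when they're only a link or two from the front page, which would match the "loses interest" behavior.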