I keep trying to scan some very large websites with this tool, but the scans keep failing.
Is there a maximum number of links the tool can crawl on a single site? I'd love to know whether I should just cap it at 5000, or whether I can crawl an entire sub-domain and gather the results.
Right now the scans are failing once I cross some threshold.