Print titles of pages which dumpgenerator.py failed to download due to MemoryError or other fatal errors #282
Comments
It is usually saved in the errors log. Can you see it?
No, I know that. I am saying that some pages are missing from the dump and from the list of pages downloaded. The error log you are talking about is different. If I know which pages went missing while downloading the dump, I can download them separately. Thanks.
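For reference, a minimal sketch (not part of dumpgenerator.py) of how one could check which titles never made it into a finished dump, by diffing the titles list against the `<title>` elements in the XML file. The filenames are hypothetical; use the ones dumpgenerator.py wrote for your wiki.

```python
import html
import re

def titles_in_dump(xml_path):
    """Collect every <title>...</title> element found in the XML dump."""
    titles = set()
    with open(xml_path, encoding="utf-8") as f:
        for line in f:
            m = re.search(r"<title>(.*?)</title>", line)
            if m:
                titles.add(html.unescape(m.group(1)))
    return titles

def missing_titles(titles_path, xml_path):
    """Return titles listed in the titles file but absent from the dump."""
    with open(titles_path, encoding="utf-8") as f:
        wanted = {line.strip() for line in f
                  if line.strip() and line.strip() != "--END--"}  # skip the end marker, if present
    return sorted(wanted - titles_in_dump(xml_path))

if __name__ == "__main__":
    # Hypothetical filenames; replace with the ones in your dump directory.
    for title in missing_titles("examplewiki-20161017-titles.txt",
                                "examplewiki-20161017-history.xml"):
        print(title)
```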
@emijrp See, it probably happens only with very large wikis. The wiki I am talking about has over thirty thousand pages.
Hey bro, just a question. I was actually talking about the pages that fail because of insufficient RAM. If I add a delay, will it be solved? I am talking about large or very large wikis like naruto or allthetroopes. Thanks.
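A hedged sketch of the behaviour this issue asks for, not the project's actual code: catch MemoryError (and other fatal errors) per page and record the title instead of losing it. `get_page_xml` is a hypothetical stand-in for whatever function fetches a single page; a delay only helps against server throttling, not against running out of RAM.

```python
import time

def dump_pages(titles, get_page_xml, out, errors_path="errors.log"):
    """Write each page's XML to `out`; record titles that fail instead of crashing."""
    for title in titles:
        try:
            out.write(get_page_xml(title))
        except MemoryError:
            # A delay would not help here: the process ran out of memory
            # while holding a very large page history.
            with open(errors_path, "a", encoding="utf-8") as log:
                log.write("MemoryError: %s\n" % title)
        except Exception as exc:
            with open(errors_path, "a", encoding="utf-8") as log:
                log.write("%s: %s\n" % (type(exc).__name__, title))
        time.sleep(0.5)  # only mitigates server-side throttling
```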
As was explained elsewhere, the usual workaround has been to download those pages manually with Special:Export in a browser. The current solution is to use the API and limit the size of the responses, which is mostly solved by using the command line option.
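For anyone who ends up with a list of missing titles, a minimal sketch of the Special:Export workaround mentioned above, assuming a hypothetical wiki URL and the third-party requests library:

```python
import requests  # third-party: pip install requests

def export_page(index_url, title):
    """Fetch the XML export of a single page through Special:Export."""
    resp = requests.get(
        index_url,
        # history=1 asks for the full revision history where the wiki allows it
        params={"title": "Special:Export/" + title, "history": "1"},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.text

if __name__ == "__main__":
    # Hypothetical wiki URL and page title, for illustration only.
    print(export_page("https://example.org/w/index.php", "Main Page")[:200])
```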
Is it possible to add a function so it outputs a list of pages that are missing and were not downloaded? I use this software often and I have pages missing. If I have a list of which pages failed, I can download them with Special:Export.