Add KLMangaSH #976
base: master
Conversation
web/src/engine/websites/KLMangaSh.ts
        'X-Requested-With': 'XMLHttpRequest',
    }
});
let { img_index, mes, going } = await FetchJSON<PageResult>(request);
The FetchPages method must avoid making too many requests. Either parallelize the requests or, better, move the logic to FetchImage.
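A minimal sketch of the parallelization idea, as a generic concurrency-limited runner; the helper name runLimited and the thunk-based task list are my own illustration, not part of the PR:

```typescript
// Run async tasks with a fixed concurrency limit instead of one request
// per loop iteration. Tasks are passed as thunks so none start early.
async function runLimited<T>(tasks: (() => Promise<T>)[], limit: number): Promise<T[]> {
    const results: T[] = new Array(tasks.length);
    let next = 0;
    // Each worker repeatedly pulls the next unstarted task until the queue is empty.
    const workers = Array.from({ length: Math.min(limit, tasks.length) }, async () => {
        while (next < tasks.length) {
            const index = next++;
            results[index] = await tasks[index]();
        }
    });
    await Promise.all(workers);
    return results;
}
```

Each page request would be wrapped in a thunk, so at most `limit` requests are in flight at any time.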
Ikr, but we don't know the real number of pages, so both of your solutions are impossible.
The real number of pages is only known once the website API returns going = 0, i.e. after the loop.
And we can't wait for the website to finish (using PageSinglePageJS), because they put a 1-second delay between requests; it would take far too long anyway.
Crappy website coding.
Maybe there are some hints in RawLazy, since it seems to use the same WordPress theme.
I checked RawLazy before submitting this PR, since I saw the similarities. RawLazy simply uses lazy loading, with one img element per page. Here we have nothing like that, AFAIK.
I mean, there is a dirty way. Make the request once and you get the first two pages.
From those we have the page folder:
hoster.com/path/number.extension
Since the page URLs follow a pattern, we could send HEAD requests while incrementing the number.
Should I go this way?
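A minimal sketch of that incremental probing, assuming the pages form a contiguous prefix; the exists callback is a hypothetical stand-in for a HEAD request against the hoster's URL pattern (e.g. checking response.ok):

```typescript
// Probe page URLs one by one until a page is missing. `exists` would wrap
// a HEAD request in practice; `max` is a safety cap against runaway loops.
async function probePagesSequentially(
    exists: (page: number) => Promise<boolean>,
    start = 1,
    max = 1000
): Promise<number> {
    let last = start - 1;
    for (let page = start; page <= max; page++) {
        if (!(await exists(page))) break; // first missing page ends the chapter
        last = page;
    }
    return last; // index of the last existing page, or start - 1 if none exist
}
```

The obvious cost is one request per page, which is exactly the "too many requests" problem again.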
You may find the last page using a divide-and-conquer (binary search) algorithm: assuming there are fewer than 128 pages, about seven requests (⌈log₂ 128⌉) would be enough to find the last API call that returns non-empty content.
See: ScanVFOrg
You may leave it as is for now; I may take a look into it later.
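The divide-and-conquer idea can be sketched as a binary search over page numbers; again, the exists predicate is a hypothetical stand-in for the HEAD request or API call, and findLastPage is an illustrative name, not code from the PR or from ScanVFOrg:

```typescript
// Find the last page for which `exists` is true, assuming pages form a
// contiguous prefix (1..N exist, everything above N does not).
// With upperBound = 128 this needs at most 7 probes.
async function findLastPage(
    exists: (page: number) => Promise<boolean>,
    upperBound = 128
): Promise<number> {
    let low = 1;
    let high = upperBound;
    while (low <= high) {
        const mid = (low + high) >> 1;
        if (await exists(mid)) {
            low = mid + 1;  // mid exists, the last page is at mid or later
        } else {
            high = mid - 1; // mid is missing, the last page is before mid
        }
    }
    return high; // 0 when even page 1 is missing
}
```

This only works because the pages are a contiguous prefix: the predicate flips from true to false exactly once, which is what binary search requires.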
The filenames should have the same casing, either …
manga-download/hakuneko#7669