long text takes very long time in compression #2
Comments
I have exactly the same problem with a JSON string of length 895726 characters; LZString::compressToBase64() takes a very long time to execute.
Hey guys, I've spent some time over the last few days trying to solve this problem. For a faster implementation, I think an approach based on the pack and unpack functions should do it. Sadly (or luckily) I'm a full-time employee and don't have that much time to put into this issue, so if any of you would like to, feel free to fork this project!
+1 It is unfortunate that encoding/decoding is this slow.
Hello, has anyone found a way to speed up the process? For a 66000-character string it takes 24 seconds with compressToUTF16, and I would need to compress over 1 million characters...
Hi, thanks for making this library. I also ran into the issue of compression and decompression time growing much faster than linearly with input size. I spent a night on it and have a fix in my clone of the repo, https://github.com/peetervois/lz-string-php, at version 71d4bf8 or the latest master. I am using compression and decompression with URI-safe strings: the server side is PHP and the frontend uses the JavaScript variant, and the two work together. The change also affected other methods, which I have not tested.
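For context, here is a minimal sketch of the PHP-to-JavaScript round trip described above. It assumes the class is autoloaded under the LZCompressor namespace and that the URI-safe method names mirror the JavaScript library (compressToEncodedURIComponent / decompressFromEncodedURIComponent); adjust the namespace and method names to whatever your installed version actually exposes.

```php
<?php
// Sketch only: namespace and method names are assumptions, not confirmed
// against the fork mentioned above.
require_once 'vendor/autoload.php';

use LZCompressor\LZString;

// Server side (PHP): compress a JSON payload into a URI-safe string.
$payload    = json_encode(['user' => 123, 'items' => range(1, 1000)]);
$compressed = LZString::compressToEncodedURIComponent($payload);

// The compressed string can be embedded in a URL or sent to the browser,
// where the JavaScript lz-string library would restore it with
// LZString.decompressFromEncodedURIComponent(compressed).
echo strlen($payload) . ' chars -> ' . strlen($compressed) . " chars\n";
```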
I have a text of length 357813 characters.
Compressing it takes a very long time. I use LZString::compressToBase64().
When I try the same string with the JavaScript LZString, it runs very fast.
Note: the string is a stringified JSON object.
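A rough timing sketch that illustrates how the runtime grows with input length. The sizes are arbitrary, and the class/namespace is assumed to match this library's autoloaded LZString class; it is meant only as a way to reproduce and measure the slowdown, not as a fix.

```php
<?php
// Benchmark sketch: measure how compressToBase64() scales with input size.
require_once 'vendor/autoload.php';

use LZCompressor\LZString; // namespace assumed; adjust to your install

foreach ([10000, 50000, 100000, 350000] as $length) {
    // Build a repetitive JSON string of roughly $length characters,
    // similar to a stringified JSON object.
    $input = json_encode(array_fill(0, (int) ($length / 20), ['key' => 'value']));

    $start = microtime(true);
    LZString::compressToBase64($input);
    $elapsed = microtime(true) - $start;

    printf("%7d chars -> %.2f s\n", strlen($input), $elapsed);
}
```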