Add a one-shot method? #8
It would be convenient to be able to do this in one shot. That said, the current implementation is flexible. This is my current usage to compress data for sharing in a URL hash for open checklist:

```js
/**
 * converts text to gzip compressed hexText
 * @param {string} text - a string
 * @returns {Promise<string>} gzip compressed hexText
 */
async function compress(text) {
  // https://developer.mozilla.org/en-US/docs/Web/API/CompressionStream
  const readStream = new Response(text);
  const compressedReadableStream = readStream.body.pipeThrough(new CompressionStream("gzip"));
  const response = new Response(compressedReadableStream);
  const buffer = await response.arrayBuffer();
  const compressedHexText = convertBufferToHex(buffer);
  return compressedHexText;
}
```
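(`convertBufferToHex` isn't shown in the thread; a typical hex-encoding helper might look like this, purely as an illustration:)

```js
// Not part of the original comment: one plausible way to hex-encode an
// ArrayBuffer, matching the convertBufferToHex() call above.
function convertBufferToHex(buffer) {
  return [...new Uint8Array(buffer)]
    .map((byte) => byte.toString(16).padStart(2, "0"))
    .join("");
}
```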
@wandyezj Thank you for your use case. Using a Response to convert a stream to an array buffer is convenient, but inefficient. See https://wicg.github.io/compression/#example-deflate-compress for a faster way to do it.
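(For reference, the spec example linked there skips the `Response` round-trip entirely: it writes the input into the stream's writable side, then reads the compressed chunks off the readable side and concatenates them into one `Uint8Array`. A condensed sketch of that pattern, using gzip instead of the spec's deflate:)

```js
// Condensed from the pattern in the linked spec example: write the
// input bytes, then gather and concatenate the compressed output chunks.
async function compressBytes(bytes) {
  const { readable, writable } = new CompressionStream("gzip");
  const writer = writable.getWriter();
  writer.write(bytes); // intentionally not awaited, as in the spec example
  writer.close();
  const chunks = [];
  let total = 0;
  const reader = readable.getReader();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    chunks.push(value);
    total += value.byteLength;
  }
  const out = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    out.set(chunk, offset);
    offset += chunk.byteLength;
  }
  return out;
}
```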
Did more investigating into multiple ways of using the Compression Streams API, and I realized that this is a really nice setup to have, possibly as a standalone module outside of NBTify. It provides a more cross-platform zlib API, similar to using `node:zlib` but available both in the browser and in Node! It's built on the Compression Streams API too, so it's only an API wrapper, plus possibly a polyfill where necessary. I'm going to try setting things up here first, then move them to a dedicated project that both handles the compression polyfills and provides these more functional-based APIs. Those are faster than my previous streams-specific `Blob` handling, and they're simpler to use directly with `ArrayBuffer` and the various `TypedArray` objects, since there's no going back and forth between `Blob` containers, which appears to be a little less performant in the simple test that I set up.

Offroaders123/Dovetail#1
whatwg/compression#8
https://wicg.github.io/compression/#example-deflate-compress
https://jakearchibald.com/2017/async-iterators-and-generators/#making-streams-iterate
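(A sketch of what that functional, zlib-like surface could look like; the `compress`/`decompress` names and the `format` parameter are illustrative, not an existing package:)

```js
// Hypothetical one-shot, cross-platform wrappers over the Compression
// Streams API that take and return Uint8Array directly, no Blob round-trips.
// format can be "gzip", "deflate", or "deflate-raw".
async function compress(data, format = "gzip") {
  return transform(data, new CompressionStream(format));
}

async function decompress(data, format = "gzip") {
  return transform(data, new DecompressionStream(format));
}

// Pushes the input through either transform stream and concatenates the
// resulting chunks into one Uint8Array, as in the spec example above.
async function transform(data, stream) {
  const writer = stream.writable.getWriter();
  writer.write(data);
  writer.close();
  const chunks = [];
  let length = 0;
  for await (const chunk of readChunks(stream.readable)) {
    chunks.push(chunk);
    length += chunk.byteLength;
  }
  const bytes = new Uint8Array(length);
  let offset = 0;
  for (const chunk of chunks) {
    bytes.set(chunk, offset);
    offset += chunk.byteLength;
  }
  return bytes;
}

// Manual async iteration over a ReadableStream, since for-await support on
// streams isn't available in every runtime yet (see the article on async
// iterators linked above).
async function* readChunks(readable) {
  const reader = readable.getReader();
  try {
    while (true) {
      const { value, done } = await reader.read();
      if (done) return;
      yield value;
    }
  } finally {
    reader.releaseLock();
  }
}
```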
I'm using the Compression API to encode data for URL fragments; a one-shot method would be really nice to have compared to piping the text through multiple streams to get there.
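(As an aside, once the compressed bytes exist, getting them into a URL fragment is its own step; one common approach, purely illustrative and assuming a one-shot `compress()` like the sketches above, is base64url rather than hex, since it's more compact:)

```js
// Illustrative only: compress some text and put it in the URL fragment as
// base64url. String.fromCharCode(...bytes) is fine for small payloads;
// large ones should be chunked to avoid argument-length limits.
const text = "state to share"; // whatever data belongs in the link
const bytes = await compress(new TextEncoder().encode(text), "gzip");
const fragment = btoa(String.fromCharCode(...bytes))
  .replaceAll("+", "-")
  .replaceAll("/", "_")
  .replaceAll("=", "");
location.hash = fragment;
```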
Maybe we should have something like a one-shot helper on CompressionStream, and a similar API for DecompressionStream, to make one-shot compression and decompression easier to do.
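(Purely as an illustration of the idea, not a concrete proposal or anything that exists today, such a surface might be used like this; the method names are hypothetical:)

```js
// Hypothetical API shape; neither of these static methods exists or is
// specified today.
const inputBytes = new TextEncoder().encode("hello");
const compressed = await CompressionStream.compress("gzip", inputBytes);
const restored = await DecompressionStream.decompress("gzip", compressed);
```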