Is it because nobody cares if a Minecraft project breaks the GPL, just like https://github.com/ProtocolSupport/ProtocolSupport/issues/1223?
@DemonWav
pacman -S brain pls
https://github.com/PaperMC/Paper/issues/3586#issuecomment-646974737
Provide an alternative and stop trolling.
I think repackaging the jar as a tar could do the job.
Here's a proper rebuttal for once, as you obviously have not checked how ZIPs and tars work. Also, fuck off still.
Let's first get it said that JARs can have compressed entries:
https://en.wikipedia.org/wiki/JAR_(file_format)
JAR file elements may be compressed, shortening download times.
And as JARs are just ZIPs in disguise, their format is kept:
https://en.wikipedia.org/wiki/Zip_(file_format)
.ZIP files are archives that store multiple files. ZIP allows contained files to be compressed using many different methods, as well as simply storing a file without compressing it. Each file is stored separately, allowing different files in the same archive to be compressed using different methods.
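The per-entry compression the quote describes is easy to check; here is a minimal sketch using Python's stdlib `zipfile` (the entry names and contents are invented for illustration):

```python
import io
import zipfile

# Build a small in-memory ZIP whose two entries use different
# compression methods, as the quote above describes.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr(zipfile.ZipInfo("stored.txt"), b"A" * 1000,
                compress_type=zipfile.ZIP_STORED)    # stored verbatim
    zf.writestr(zipfile.ZipInfo("deflated.txt"), b"A" * 1000,
                compress_type=zipfile.ZIP_DEFLATED)  # deflate-compressed

with zipfile.ZipFile(buf) as zf:
    entries = zf.infolist()
    for e in entries:
        # The stored entry keeps its full 1000 bytes; the deflated
        # one shrinks, since 1000 repeated bytes compress very well.
        print(e.filename, e.compress_type, e.file_size, e.compress_size)
```

The same archive holds one uncompressed and one compressed entry side by side, which is exactly what JARs rely on.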
And now for TARs:
https://en.wikipedia.org/wiki/Tar_(computing)
Each file object includes any file data, and is preceded by a 512-byte header record. The file data is written unaltered except that its length is rounded up to a multiple of 512 bytes.
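That layout is just as easy to verify with Python's stdlib `tarfile`; a small sketch (the entry name is made up):

```python
import io
import tarfile

payload = b"B" * 1000  # 1000 bytes of file data

buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tf:
    info = tarfile.TarInfo("data.bin")
    info.size = len(payload)
    tf.addfile(info, io.BytesIO(payload))

raw = buf.getvalue()
# The file data sits in the archive byte-for-byte, unaltered,
# and the whole archive is laid out in 512-byte blocks.
print(payload in raw)        # data is stored verbatim
print(len(raw) % 512 == 0)   # everything is 512-byte aligned
```

No transformation of the data happens anywhere; a tar is just headers plus the raw bytes, padded.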
So you're proposing we replace a compression-supporting format with one that has no compression at all. If the JAR (i.e. ZIP) file were uncompressed, you would have almost exactly the same format, resulting in almost no gain whatsoever. If compression is used, which it is by default, you're GROWING the file by using a tar.
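A quick way to see that size difference on compressible data, sketched with the Python stdlib (the payload is synthetic and stands in for class files):

```python
import io
import tarfile
import zipfile

# Highly repetitive payload standing in for class files, which
# compress well; the exact content is invented for illustration.
payload = b"public class Example { void run() {} }\n" * 3000

# Tar: bytes stored unaltered, plus 512-byte headers and padding.
tbuf = io.BytesIO()
with tarfile.open(fileobj=tbuf, mode="w") as tf:
    info = tarfile.TarInfo("Example.class")
    info.size = len(payload)
    tf.addfile(info, io.BytesIO(payload))

# Zip: the same bytes, deflate-compressed per entry (the JAR default).
zbuf = io.BytesIO()
with zipfile.ZipFile(zbuf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("Example.class", payload)

tar_size, zip_size = len(tbuf.getvalue()), len(zbuf.getvalue())
print(tar_size, zip_size)  # the zip ends up far smaller
```

The tar can never be smaller than the raw data it holds, while the deflated zip collapses the repetition.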
Stop trolling, read into what you say, then come back with an educated opinion and proposal.
@Proximyst I know how ZIPs and tars work. I think tars can have more identical bytes, so the patch can be smaller.
I think
Ah, there it is. That's the problem here.
No. You do not know how tars work. They're uncompressed, i.e. they store the ENTIRE file without altering it in ANY way, and no mainstream JRE implementation supports reading them, even less so when wrapped and compressed with GZ, XZ, LZMA, or other algorithms. ZIPs do support compression, ultimately leading to smaller files than tars can produce.
Read properly into what you're proposing and accept the fact that you have not done so yet. Read my response; I made it very clear where you were wrong in your reply to that very message.
Either implement a more effective patch system yourself and PR it, or get out.
Also, nobody really cares about how effective the patch system is in terms of size. People have more important things to do than trying to save a few bytes of space here.
On top of that, you are using the wrong repo for this discussion.
@Proximyst I know how tars work. The point is to use bsdiff to create patches for tars, not zips.
@Shevchik It's not just a problem of size.
@Proximyst I know how tars work. The point is to use bsdiff to create patches for tars, not zips.
Then stop wasting time here and implement it. Fork paperclip, do it, and prove that the result is better, smaller, and functional.
@Proximyst I know how tars work. The point is to use bsdiff to create patches for tars, not zips.
You've repeated wrong assumptions about how tars work over and over again, continuously proving you don't have the slightest grip on what you're talking about. Read my response; it makes very clear where you're fundamentally wrong. I picked out the most important quotes for you, so the least possible reading is needed to see how you're wrong.
bsdiff will not behave much differently; it's still the same content. The only difference is that a tar cannot compress its entries at all, which would be a giant issue. Look at this:
35M mojang_1.15.2.jar - Mojang's 1.15.2 jar
49M patched_1.15.2.jar - Paper's PATCHED 1.15.2 jar
43M ../paper-355.jar - Paper's 1.15.2 Paperclip jar
135M mojang - Mojang's jar unpacked (a little SMALLER than tars would be)
175M paper - Paper's jar unpacked (a little SMALLER than tars would be)
Read the sources I linked above.
@Proximyst I understood all of that.
The binaries are not the same for bsdiff.
Because the zip is compressed, there tend to be more differing bytes, which can make the patch large. The patch itself will also be compressed.
He means that Paperclip does a diff between jars, and diffs between two compressed files can be large (because they are compressed differently).
Diffing the tars won't make things better, however, because the result will depend on the order of classes in the archive (diffing jars also depends on order, so diffing whole archives isn't a good idea either). Creating individual per-class patches instead (like Forge does, for example) would probably decrease the size.
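The per-entry idea mentioned above can be sketched as follows. This is a toy illustration with stdlib `zipfile` and invented entry names, not Forge's actual mechanism: comparing decompressed entries by name makes the diff independent of the order in which entries were written to the archive.

```python
import io
import zipfile

def make_zip(entries):
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in entries:
            zf.writestr(name, data)
    return buf.getvalue()

def entry_map(zip_bytes):
    # name -> decompressed bytes; archive order no longer matters.
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        return {name: zf.read(name) for name in zf.namelist()}

def per_entry_changes(old_zip, new_zip):
    # A real patcher would run a binary diff per changed entry;
    # here we only identify which entries need a patch.
    old, new = entry_map(old_zip), entry_map(new_zip)
    changed = {n for n in new if old.get(n) != new[n]}
    removed = set(old) - set(new)
    return changed, removed

old = make_zip([("A.class", b"aaaa"), ("B.class", b"bbbb")])
# Same contents except B.class changed, and entries written in reverse order.
new = make_zip([("B.class", b"patched"), ("A.class", b"aaaa")])

changed, removed = per_entry_changes(old, new)
print(changed, removed)
```

Despite the reversed entry order, only the one genuinely changed class is flagged, which is what keeps per-entry patches small.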
But as I said, nobody cares, so unless there is a PR doing that, nobody will even look into it.
Why? Just why? Why do you want to reduce the size of something which takes up so little space, on the scale of modern disk space?
If you think there is a better alternative, demonstrate it. Otherwise no one will care. Paperclip works and its space has never been an issue.
@A248 https://github.com/PaperMC/Paper/issues/3589#issuecomment-646983559
It's not just a problem of size.
Then what is the problem? What are you trying to achieve?
I have no idea how paperclip would violate the GPL linking clause because of its disk size.
I've made a proof of concept where it first decompresses the Mojang and Paper (non-paperclip) jars, generates a patch file, then builds a compressed paperclip. The paperclip also decompresses the Mojang jar before patching. The result was a 70MB jar file, quickly showing your solution to be ineffective. Building the jar also took notably longer, and patching was slower. There is no gain whatsoever in using your proposal.
Please do not make any more issues on this topic unless it is a pull request showing a significant improvement.