|
Post by AnthroHeart on May 27, 2021 3:39:31 GMT
reden Can you watch this and get an idea of the number we are going for, i.e. how big our HASH should be to help guarantee uniqueness? There is one about the Boltzmann Brain, but I don't know if we need to go that far.
|
|
|
Post by reden on May 27, 2021 3:44:00 GMT
I did some calculations. SHA-256 gives a 256-bit digest (2^256 possible values); to contain something like the number of quantum observable states we would want more like a 2048-bit digest (2^2048 values), a "SHA-2048" if it existed. The arbitrary-output-size hash functions BLAKE3, RadioGatún, Skein, and the SHAKE functions from the SHA-3/Keccak family can produce digests that long.
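For reference, a minimal sketch of producing a 2048-bit (256-byte) digest with two of those, assuming the b3sum command-line tool and OpenSSL 3.x (which added -xoflen for the SHAKE functions) are installed:
# BLAKE3 with a 256-byte (2048-bit) hex digest; --length is given in bytes
b3sum --length 256 NESTFILES.ZIP
# SHAKE256 (a SHA-3 family XOF) at the same output length, via OpenSSL 3.x
openssl dgst -shake256 -xoflen 256 NESTFILES.ZIP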
|
|
|
Post by hambumger1 on May 27, 2021 3:49:03 GMT
That's one of my favorite videos
|
|
|
Post by AnthroHeart on May 27, 2021 3:50:35 GMT
Cool. Do you want me to write a quick program that can write whatever hash you want to a TXT file, however many times you want? And just make it that simple?
|
|
|
Post by reden on May 27, 2021 3:53:13 GMT
Hmmm... it will need nesting functionality, won't it?
|
|
|
Post by AnthroHeart on May 27, 2021 3:54:42 GMT
No, the Nesting Utility does that.
The Hash Writer will write your hash string to a file like P0, and then you can add it to your own NESTFILES.ZIP.
The nesting structure is there.
That would be the fast way to do it before I make a more robust app.
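As a rough sketch of that quick path, assuming the Info-ZIP zip tool and a placeholder hash string:
# write the hash string to P0 (a single line here, just to illustrate)
echo "your-hash-string-here" > P0
# add or update P0 inside your own NESTFILES.ZIP
zip NESTFILES.ZIP P0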
|
|
|
Post by reden on May 27, 2021 4:02:09 GMT
Ah, ok.
|
|
|
Post by AnthroHeart on May 27, 2021 4:05:32 GMT
I will get onto it tomorrow. So the Hash Writer will:
1) Ask you to provide the HASH of the NESTFILES.ZIP, or whatever file you want. That way you can use a plethora of hashing functions.
2) Ask how many times to write the HASH to the file P0.
3) Write the hash N times to P0.
Pretty easy, but I'll get onto it tomorrow. If you want to wait on further hashing files, I'll get on this ASAP.
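Until that app exists, a minimal shell sketch of those three steps might look like this (hash_writer.sh is just a hypothetical name):
#!/bin/bash
# hash_writer.sh - rough sketch of the three steps above
read -p "Enter the HASH of NESTFILES.ZIP (or whatever file you hashed): " HASH   # step 1
read -p "How many times should it be written to P0? " COUNT                      # step 2
: > P0                                     # start from an empty P0
for ((n = 1; n <= COUNT; n++)); do         # step 3: write the hash COUNT times
  echo "$HASH" >> P0
done
echo "Wrote $COUNT copies of the hash to P0"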
|
|
|
Post by reden on May 27, 2021 4:07:16 GMT
Since -h8 tries to escape the Universe according to you, I might as well wait.
|
|
|
Post by AnthroHeart on May 27, 2021 4:17:00 GMT
A quick web search and a test gave me this. Easy Peasy. This will write the -h7 hash value to the file P0, 5 times:
for n in {1..5}; do echo "b52df8190283cc4829e454bb7a7986a95db9a7c58d21db7ecca57187a1df1cc9" >> P0; done
This is run in a Linux terminal. There may be an equivalent for Windows.
|
|
|
Post by AnthroHeart on May 27, 2021 15:13:15 GMT
reden was this helpful, or is it easier to use VIM?
|
|
|
Post by reden on May 27, 2021 15:19:30 GMT
I haven't tried it yet, but I know that echo is well optimized, and some OSes' echo implementations may be even more optimized. If it works, it saves me from Vim freezing for a few seconds as it struggles to write a line 10M times or delete the whole file at once. Also, in your command it should be >, which replaces the entire file, because >> appends; if you used >>, you would end up with two different hashes in P0.
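If the count gets as large as that 10M figure, even a shell loop around echo gets slow; one sketch that lets coreutils do the repetition instead (assuming GNU yes and head are available):
# repeat the -h7 hash 10 million times into P0, replacing any old P0
yes "b52df8190283cc4829e454bb7a7986a95db9a7c58d21db7ecca57187a1df1cc9" | head -n 10000000 > P0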
|
|
|
Post by AnthroHeart on May 27, 2021 15:20:34 GMT
Try it both ways. Wouldn't > overwrite the file each iteration, so you end up with only one line? The >> only writes one line at a time; it doesn't duplicate.
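A quick way to try it both ways, with throwaway file names (demo_append and demo_overwrite are made up just for the test):
for n in {1..5}; do echo "hash" >> demo_append; done     # >> appends: 5 lines
for n in {1..5}; do echo "hash" > demo_overwrite; done   # > overwrites: 1 line
wc -l demo_append demo_overwrite
rm demo_append demo_overwrite                            # clean up the test files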
|
|
|
Post by reden on May 27, 2021 15:27:10 GMT
You are right. > overwrites the file each iteration. But then you are left with the issue of having to overwrite the older extracted file. A solution is to just create a new P0 for every new zip you make.
|
|
|
Post by AnthroHeart on May 27, 2021 15:33:37 GMT
Yeah, the ZIP references P0 as the start of the stack. So it needs to be that name.
Yeah, create several versions. It's easy. I'm pretty busy with other stuff, but I'm sure you can figure it out.
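One possible sketch of that per-version workflow, where the hash strings and the versioned zip names are placeholders:
v=1
for HASH in "hash-for-version-1" "hash-for-version-2"; do   # placeholder hash strings
  : > P0                                                    # start each version with an empty P0
  for n in {1..5}; do echo "$HASH" >> P0; done
  zip "NESTFILES_v$v.ZIP" P0                                # a separate zip per version
  v=$((v + 1))
done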
|
|