|
Post by reden on May 24, 2021 14:39:12 GMT
"Is that bold part the hash itself?" Yes. "If we can HASH, and then build that HASH up to GB in size, and then HASH again and repeat, I wonder if it can be infinite?" It is plausible. What size HASH would we need to reduce the risk of collisions? Now we're in cryptography. Fascinating. crypto.stackexchange.com/questions/47809/why-havent-any-sha-256-collisions-been-found-yet Even sha256 would work according to this link. No one has found true collisions, only partial ones. Maybe SHA3.
|
|
|
Post by AnthroHeart on May 24, 2021 14:41:02 GMT
I mean, if you can create a hash of a GB file in seconds, would it be better in this case to go BIG?
|
|
|
Post by reden on May 24, 2021 14:42:07 GMT
If it is feasible, I'll create a hashing engine later to loop the hash, build it up to whatever size, then hash and repeat. Wait... can't we already do this by mentioning the NEST files? Mention the NEST file 10,000 times, then zip it up, then mention that 10,000 times more... Could that work?
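If it helps, a rough sketch of that mention-zip-mention loop in Python (the names NEST-1.TXT, mentions.txt, and mentions.zip are placeholders of mine, not the real NEST file names):

import zipfile

MENTIONS = 10_000  # repeat count from the idea above

def mention(name: str, out_path: str, times: int = MENTIONS) -> None:
    # Write the target's name over and over, one mention per line.
    with open(out_path, "w") as f:
        for _ in range(times):
            f.write(name + "\n")

# Round 1: mention the NEST file 10,000 times.
mention("NEST-1.TXT", "mentions.txt")

# Zip the mentions up...
with zipfile.ZipFile("mentions.zip", "w", zipfile.ZIP_DEFLATED) as z:
    z.write("mentions.txt")

# ...then mention the zip 10,000 more times; repeat as desired.
mention("mentions.zip", "mentions2.txt")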
|
|
|
Post by AnthroHeart on May 24, 2021 14:44:38 GMT
Probably would work. Might be better than doing a GB string.
You could probably modify my Nesting File Creation Utility to do HASHES.
I'm a bit mentally exhausted now, and a little tipsy from energy, so maybe you can look into it.
I used Mozilla style with clang-format because it's my favorite one so far.
|
|
|
Post by reden on May 24, 2021 20:57:02 GMT
I made an 86 MB file that has both hashes and file name mentions, and put it in HSUPLINK.TXT. I ran it hours ago, and it is still running now.
|
|
|
Post by AnthroHeart on May 24, 2021 21:05:15 GMT
Interesting, both Repeater and Nesting utility are hitting 100% CPU and only using about 18% of RAM together.
|
|
|
Post by AnthroHeart on May 24, 2021 21:05:40 GMT
Could you do some intentions to help me with my transformation dreams too?
|
|
|
Post by reden on May 24, 2021 21:08:02 GMT
"Could you do some intentions to help me with my transformation dreams too?" Such as? You have access to the 10-1Mreps file.
|
|
|
Post by AnthroHeart on May 24, 2021 21:09:01 GMT
Ok thanks. I'll look into it.
|
|
|
Post by AnthroHeart on May 24, 2021 22:20:52 GMT
reden, your hash.txt is really awesome. I'm running that now on my INTENTIONS.TXT and getting 28.6 PHz with 3.5 GB of RAM.
|
|
|
Post by reden on May 24, 2021 22:22:05 GMT
The hash only? Edit: I tried to make one in the you-know-where, but I made it 890 MB instead. Here is the 86 MB version. Attachments: hash.zip (315.51 KB)
|
|
|
Post by AnthroHeart on May 24, 2021 22:24:37 GMT
Yep, the one named hash.txt. It feels strong, but is it a hash of the nesting file referencing INTENTIONS.TXT? The one that is 890,989,090 bytes in size.
|
|
|
Post by reden on May 24, 2021 22:26:32 GMT
"Yep, the one named hash.txt. It feels strong, but is it a hash of the nesting file referencing INTENTIONS.TXT? The one that is 890,989,090 bytes in size." It is. That one is the 890 MB version. Edit: It's both the hash and the zip file name.
|
|
|
Post by AnthroHeart on May 24, 2021 22:27:38 GMT
How did you make the HASH that big?
|
|
|
Post by reden on May 24, 2021 22:34:04 GMT
"How did you make the HASH that big?" A manual copying process in vim. I copied the single statement 10,000 times, IIRC, onto a single line. I had wanted to do just 1,000 to keep the space cost down (this was on my laptop), but I overspecified the count, so it was copied 10,000 times and became 890 MB. Remember that it also includes the file name; I copied it that way from shasum's output. Maybe it doesn't need the file name.
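For anyone who would rather script it than copy lines in vim, a short Python sketch builds the same kind of file. The name nesting.zip and the shasum-style "digest  filename" line are my assumptions about what was copied:

import hashlib

# Hash the zip in chunks, the way shasum -a 256 would (it prints "digest  name").
h = hashlib.sha256()
with open("nesting.zip", "rb") as f:        # placeholder name for the 890 MB zip
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)
line = f"{h.hexdigest()}  nesting.zip"

# Repeat the digest+filename statement N times instead of copying it by hand.
with open("hash.txt", "w") as out:
    for _ in range(10_000):
        out.write(line + "\n")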
|
|