Ok. I’m getting tired. You bested me this round. Have a nice day.
You say it’s the goal of the proletariat to protect the revolution, but why would they? Each proletarian would benefit from the revolution’s failure: they could live better lives as bourgeois. You talk about the proletariat like it’s some monolithic entity with a single mind and goal. You talk big about helping the individual, but you cannot see beyond their class. A proletarian is a person, with needs, desires, and opinions. What father would hold the abstract ideals of the “revolution” over the life of his sick daughter? Any father I know would do anything for the safety of his children, even hoard life-saving medicine from others.
Communist logix
we need to abolish private property so everybody has equal power.
we need a more powerful class of people to maintain public ownership.
After all, how can we enforce public ownership without a more powerful class of enforcers?
I’m pretty sure fused multiply-add (FMA) is part of the AVX family of instruction set extensions. Strictly speaking it has its own CPUID flag (FMA3), which shipped alongside AVX2.
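For reference, FMA computes x*y + z with a single rounding instead of two. A quick way to see the difference from Python, assuming Python 3.13+ (which added `math.fma`); the example values here are mine, not from the thread:

```python
import math

# 1 + 2**-52 is the smallest double greater than 1.0.
x, y, z = 1e16, 1.0000000000000002, -1e16

# Naive form rounds x*y first, losing the tail of the product.
print(x * y + z)          # 2.0

# Fused form keeps the full product before adding z.
print(math.fma(x, y, z))  # ~2.220446049250313
```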
Would you prefer it if the meme contained all the suicide hotlines for every country?
Recursion makes it cheaper to run in the dev’s mind, but more expensive to run on the computer. Subroutines are always slower than a simple jump.
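To make the trade-off concrete, here’s a toy micro-benchmark sketch. The summation example and harness are my own illustration, not a rigorous measurement:

```python
import sys
import timeit

sys.setrecursionlimit(10_000)  # depth 5000 exceeds CPython's default 1000

def sum_recursive(n):
    # Every level pays for a fresh call frame: push args, jump, return.
    return 0 if n == 0 else n + sum_recursive(n - 1)

def sum_iterative(n):
    # The loop body is just an add and a conditional jump.
    total = 0
    while n > 0:
        total += n
        n -= 1
    return total

assert sum_recursive(5000) == sum_iterative(5000)
print("recursive:", timeit.timeit(lambda: sum_recursive(5000), number=1000))
print("iterative:", timeit.timeit(lambda: sum_iterative(5000), number=1000))
```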
Hand-written assembly is much more powerful than a Turing-complete high-level language because it lets you fuck up everything. Rust and Python are way too wimpy to let a user destroy their computer.
So you made a meme about how your opponent is completely irrational and you are a paragon of logic and reason, and then proceeded to declare yourself the winner?
I completely agree that it’s a stupid way of doing things, but it is how OpenAI reduced the vocab size of GPT-2 and GPT-3. As far as I know (I’ve only read the comments in the source code), the conversion is done as a preprocessing step. Here’s the code to GPT-2: https://github.com/openai/gpt-2/blob/master/src/encoder.py I did apparently make a mistake, as the vocab reduction is done through a lookup table (LUT) instead of a simple mod.
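For the curious, the LUT amounts to roughly this. It’s a from-memory paraphrase of `bytes_to_unicode()` in the linked encoder.py, so check the repo for the authoritative version:

```python
def bytes_to_unicode():
    """Map every byte 0-255 to a printable Unicode character.

    Printable bytes map to themselves; the rest are shifted into
    unused codepoints above 255, so the BPE merges never have to
    deal with unprintable or whitespace bytes directly.
    """
    bs = (list(range(ord("!"), ord("~") + 1))
          + list(range(ord("\xa1"), ord("\xac") + 1))
          + list(range(ord("\xae"), ord("\xff") + 1)))
    cs = bs[:]
    n = 0
    for b in range(256):
        if b not in bs:
            bs.append(b)
            cs.append(256 + n)  # park the byte in an unused codepoint
            n += 1
    return dict(zip(bs, [chr(c) for c in cs]))

# Every byte gets exactly one printable stand-in.
table = bytes_to_unicode()
assert len(table) == 256
print(table[10])  # the newline byte maps to a printable character
```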
Can’t find the exact source (I’m on mobile right now), but the code for the GPT-2 encoder uses a byte-to-Unicode lookup table to shrink the vocab size. https://github.com/openai/gpt-2/blob/master/src/encoder.py
This might be happening because of the ‘elegant’ (incredibly hacky) way OpenAI encodes multiple languages into their models. Instead of using all character sets, they use a modulo operator on each character so that every Unicode character is represented by a small range of values. On the back end, it somehow detects which language is being spoken and uses that character set for the response. Seeing as the last line seems to be the same mathematical expression as what you asked, my guess is that your equation just happened to perfectly match some sentence that would make sense in the weird language.
Anything that’s Turing-complete, has enough RAM, and has a C compiler can run Linux. Theoretically, you could program a CPLD to run brainfuck and you could still run Linux.
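For a sense of how little machinery Turing-completeness demands, here’s a minimal brainfuck interpreter sketch (my own toy example, about the scale of logic that hypothetical CPLD would need to implement):

```python
import sys

def run_bf(src, tape_len=30000):
    """Interpret a brainfuck program; I/O goes through stdin/stdout."""
    tape, ptr, pc = [0] * tape_len, 0, 0
    # Pre-compute matching bracket positions for O(1) loop jumps.
    jumps, stack = {}, []
    for i, c in enumerate(src):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(src):
        c = src[pc]
        if c == '>':
            ptr = (ptr + 1) % tape_len
        elif c == '<':
            ptr = (ptr - 1) % tape_len
        elif c == '+':
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-':
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '.':
            sys.stdout.write(chr(tape[ptr]))
        elif c == ',':
            ch = sys.stdin.read(1)
            tape[ptr] = ord(ch) % 256 if ch else 0
        elif c == '[' and tape[ptr] == 0:
            pc = jumps[pc]  # skip the loop body
        elif c == ']' and tape[ptr] != 0:
            pc = jumps[pc]  # repeat the loop body
        pc += 1

# Prints "Hi"
run_bf('++++++++[>+++++++++<-]>.>++++++++++[>++++++++++<-]>+++++.')
```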
You can’t choose where you grow up. :(