FreedomAdvocate

joined 8 months ago
[–] FreedomAdvocate 0 points 3 months ago (1 children)

That was Biden.

[–] FreedomAdvocate 3 points 3 months ago

This looks absolutely terrible to use as a TV. 350 nits of brightness alone makes it unusable. No HDR, only 60Hz, and terrible contrast too.

[–] FreedomAdvocate 1 points 3 months ago

Liberals have been the ones doing it, which is why no one cares about or fears being called a Nazi, racist, bigot, -phobe, fascist, or any of the other overused names the left loves to call anyone who doesn’t agree with them anymore.

It was working: it radicalised their loyalists and scared a lot of regular people into submission. But like you said, overuse - especially when it was clearly not appropriate - made the tactic less and less effective.

[–] FreedomAdvocate -2 points 3 months ago (5 children)

I love it when people point to China and their “renewables” and batteries as an example while ignoring the fact that they’re building dozens of new coal power plants every year, along with nuclear plants, which the people pushing these articles refuse to even consider.

[–] FreedomAdvocate 2 points 3 months ago

It shouldn’t be Google who controls which images they return in Google search? It’s their product…

[–] FreedomAdvocate 1 points 3 months ago

Completely wrong.

They don’t need to keep the images because they hash them. They store the hashes - that’s the point. CSAM detection works the same way.

If your hash matches the database hash (on 2 or more databases), then it will be flagged for manual review. They don’t need to know which image it matched, because they look at your image and go “yeah, that’s an intimate image, so it’s a match”.
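
Roughly what that looks like (a simplified sketch - real systems use perceptual hashes like PhotoDNA rather than plain SHA-256, and the stored digests here are just placeholders):

```python
import hashlib

# Hypothetical set of digests of known intimate images.
# Only the hashes are kept - the images themselves are never stored.
KNOWN_HASHES = {
    "placeholder_digest_1",
    "placeholder_digest_2",
}

def check_upload(image_bytes: bytes) -> bool:
    """Hash the upload and flag it for manual review if the digest is known."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    # A hit only says "this matched something in the list" - the service
    # never needs to pull up the original image it matched against.
    return digest in KNOWN_HASHES
```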

[–] FreedomAdvocate 2 points 3 months ago

But you said you only have this problem - the one you started this topic over - for 2 days a month?

2 days a month x 12 months = 24 days, which is just over 3 weeks a year lol

[–] FreedomAdvocate 9 points 3 months ago

You guys are all acting like this “technology” is new lol. It’s the exact same way that all of the big companies detect CSAM - they have databases of hashes of known CSAM images, and every time you upload a file to their servers they hash your image and compare it to their database. If your uploads get a few matches, they flag your account for manual investigation.

All this is doing is applying the same process to other types of images - non-consensual intimate images, or “revenge porn” as it’s more commonly known.

CSAM detection has safeguards against the kind of abuse you mention: it uses hash databases managed by separate companies all over the world, and an image has to match on multiple databases, precisely to stop it from being abused that way. I would assume this is the same.
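
A rough sketch of that multi-database safeguard (the threshold of 2, the database names, and the use of SHA-256 instead of a perceptual hash are all simplifying assumptions on my part, not how any particular provider actually does it):

```python
import hashlib

# Hypothetical hash lists maintained by separate organisations.
DATABASES = {
    "db_org_a": set(),
    "db_org_b": set(),
    "db_org_c": set(),
}

MATCH_THRESHOLD = 2  # must appear in at least this many independent lists

def should_flag(image_bytes: bytes) -> bool:
    """Flag an upload only if its hash appears in multiple independent databases.

    Requiring agreement across separately managed lists means one bad entry
    in a single database can't get an account flagged on its own.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    matches = sum(1 for db in DATABASES.values() if digest in db)
    return matches >= MATCH_THRESHOLD
```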

[–] FreedomAdvocate -2 points 3 months ago (5 children)

This is cool, but it’s not a replacement for notepad.

[–] FreedomAdvocate -2 points 3 months ago

Your sexual attraction has nothing to do with gender - yours or anyone else’s - and everything to do with sex.

Homosexuality is same sex attraction.

Bisexuality is attraction to both sexes.

Asexuality is attraction to neither sex.

Heterosexuality is attraction to the opposite sex.

Everything outside of that is really just people trying to give themselves a label to stand out.
