I was just wondering about a general privacy goal: have an LLM bot flood the zone with random data to confound advertising models, simulating clicks and likes/engagement across the spectrum to wreck any meaningful data correlations.
If you were aiming this concept at two specific targets, i.e., costing the Trump campaign money and screwing with their data, things could get really interesting. Like an open source bot that would coordinate bizarre trends across large cohorts of users to convince the data miners that, for example, a disproportionate number of voters in key regions are demographically or behaviorally skewed.
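For context, the untargeted version of this idea has existed for a while: TrackMeNot, for example, issues randomized decoy search queries so trackers can't build a coherent interest profile. Here's a minimal sketch of that kind of noise loop; the topic list, endpoint, and timing are all made-up placeholders for illustration, not any tool's actual implementation:

```python
# profile_noise.py -- a minimal sketch of the "flood the zone" idea.
# Hypothetical illustration only: periodically fetch pages on randomly
# chosen, unrelated topics from your own machine, so any tracker
# watching the traffic sees an incoherent interest profile (the same
# idea behind TrackMeNot's randomized decoy search queries).

import random
import time
import urllib.parse
import urllib.request

# Deliberately scattershot topics; the point is that no coherent
# demographic or behavioral profile can be inferred from them.
TOPICS = [
    "cast iron restoration", "competitive cup stacking", "sasquatch sightings",
    "municipal bond ladders", "medieval falconry", "vintage modem repair",
]

def noise_visit(topic: str) -> None:
    """Issue one innocuous search request for a random topic."""
    # DuckDuckGo's HTML endpoint used as a stand-in search target.
    url = "https://html.duckduckgo.com/html/?q=" + urllib.parse.quote(topic)
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        resp.read()  # fetch and discard; the visit itself is the noise

if __name__ == "__main__":
    while True:
        try:
            noise_visit(random.choice(TOPICS))
        except OSError:
            pass  # a failed fetch doesn't matter for noise purposes
        # Randomized, human-ish pacing so the noise blends into real browsing.
        time.sleep(random.uniform(60, 600))
```

Note this only adds noise to your own profile; it doesn't click ads or touch anyone else's data, which keeps it on the TrackMeNot side of the line rather than the fraud side discussed below.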
Take a look at this browser extension: https://adnauseam.io/
As online advertising becomes ever more ubiquitous and unsanctioned, AdNauseam works to complete the cycle by automating ad clicks universally and blindly on behalf of its users. Built atop uBlock Origin, AdNauseam quietly clicks on every blocked ad, registering a visit in ad networks' databases. As the collected data shows an omnivorous click-stream, user tracking, targeting and surveillance become futile.
Ooh, that’s nice. Now change “omnivorous” to “targeted” and things get interesting.
Thanks, friend!
Do those clicks appear to be coming from me, or from some random fake identity?
False floods of data or not, there are some things I’d rather not have any identifiable contact with at all.
From my understanding, they appear to be coming from you; that's kinda the point.
It should at least poison any data they gather about you, right? Since you're never realistically going to click on any of these ads, it would now look like anything and everything interests you.
That could get you on some interesting lists.
Personally, I’d rather not have any database think I was interested in certain topics, no matter how false that data is.
Personally, I’d just limit it to feeding them data suggesting that a large undecided segment believes a few provably false, outlandish things, so that they publicly endorse said things when they could otherwise be spending that time doing something socially destructive.
Can we get the politicians to shift from illegal aliens to Sasquatch? Build a wall across Washington state and make Canada pay for it?
Holy shit… This would be amazing.
I mean, wouldn’t it be crazy to get something like “people are eating cats and dogs” said publicly?
I like the idea, but I’d worry about getting sued for fraud. Though it’s not likely that would be a top issue, what with his trying to stay out of prison.
I’m not a lawyer, but I’m not sure how liable you’d be. People run bots all the time. Plus, this is all about numbers; you can’t sue thousands of people like that.