In case your local government decides to collect every single domain or URL you visit: why not make doing so as ineffective and costly as possible?
This tool runs on your normal desktop PC or Mac and tries to surf like a human would. It generates a constant stream of domains and URLs your provider has to log "for security reasons". Let's see how many more disks they are willing to spend on that nonsense.
We use Google News feeds as a varying source of websites, hoping this will make the visited sites reasonably safe to browse.
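The core idea, extracting article links from a news feed and pacing visits with human-like delays, can be sketched as below. This is a hypothetical illustration, not the tool's actual code; the class name, the regex-based link extraction, and the 5 to 30 second delay range are all assumptions.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Sketch of the noise-surfing loop: pull article links out of an
 *  RSS feed body, then visit them with human-like pauses. */
public class NoiseSurfer {
    // Matches <link>...</link> entries in a raw RSS document.
    private static final Pattern LINK = Pattern.compile("<link>(http[^<]+)</link>");
    private static final Random RNG = new Random();

    /** Extract all item links from an RSS feed body. */
    static List<String> extractLinks(String rss) {
        List<String> urls = new ArrayList<>();
        Matcher m = LINK.matcher(rss);
        while (m.find()) {
            urls.add(m.group(1));
        }
        return urls;
    }

    /** A pause between 5 and 30 seconds, roughly how long a human
     *  might spend reading a page before clicking on. */
    static long nextDelayMillis() {
        return 5_000 + RNG.nextInt(25_000);
    }

    public static void main(String[] args) {
        // A tiny inline feed stands in for a fetched Google News RSS response.
        String rss = "<rss><channel>"
                + "<item><link>http://example.com/a</link></item>"
                + "<item><link>http://example.com/b</link></item>"
                + "</channel></rss>";
        for (String url : extractLinks(rss)) {
            System.out.println(url); // the real tool would fetch the page here
        }
    }
}
```

In a real run the feed would be fetched over HTTP and each extracted URL visited after `nextDelayMillis()`, so the resulting trace looks paced like a person reading, not a crawler.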
The tool might become more intelligent, hide its presence better, or use less memory. Java and its libraries are prone to eating all your memory; I'm working on this.