DuckDuckGo censorship

If DDG goes down this route, may be time to look for a new search engine….


Already started to look for myself. Brave Search seemed like an option. Anyone have an opinion there? I had used StartPage in the past as well.

I’ve used Brave, but not enough to have a strong opinion. It works well enough.

But I can’t believe duckduckgo is so blind to the can of worms it’s helping dump all over the internet. I’m very disappointed, I expected a lot more from them.


So, DuckDuckGo has decided to put politics above the interests of its target consumers. People don’t flock to alt-tech for the “Google experience.” It’s a shame, really; DuckDuckGo was always my go-to search engine. I’m a big fan of the Brave browser, but I’ve never used their search engine.


I have absolutely seen a decline in DuckDuckGo lately. I hadn’t seen that article before now, but it makes sense. It is still the default that comes up in Brave, but more often than not I switch to Startpage to redo the search. I’ll probably change my default to Startpage. I am happy with it and highly recommend it.


MetaGer - a metasearch engine, so it draws on results from several different search engines. And of course the usual privacy blabla. :slight_smile:


I use Ecosia.


Would definitely prefer flags over down-ranking. Filtering on flags would be handy for checking scientific veracity when I’m not looking for disinformation.


Not if they control the flags. You’d have to let users flag items, but you can’t trust the internet to behave like adults.


Pretty disappointing. I can see some reason for down-ranking certain content (e.g. DuckDuckGo seems to rank phishing sites over official ones from time to time), but I’m not sure this is the way to go. If we want filtering based on what’s considered “disinformation”, we can just use any of the Big Tech search engines. And who gets to decide what is disinformation? The idea that the Earth revolves around the sun was once “disinformation.”

I’ve never been very impressed with their search results, prompting me to check elsewhere. StartPage gives good results, but can be rather annoying about blocking your searches because they sometimes think you’re a bot if you’re using a VPN and have JavaScript disabled.

has surprisingly good search results.


Down-ranking whole sites is problematic. It brings some of the problems of anti-spam to anti-disinformation.

Who knows what “associated with” means?

On the other hand, as the CEO says, privacy does not imply freedom from censorship. Censorship is essentially an integrity failure rather than a privacy failure.



Swisscows might suffer from censorship issues as well: “We place great importance on family-friendly Internet content!”


On the other hand, I wouldn’t mind a search engine that gave me an optional toggle:

Remove obviously crap search results from this list. :slightly_smiling_face:


Speaking to that, they should allow a user-defined filter. A blacklist, I guess, so you’d never see anything from Facebook or wherever.
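A user-defined domain blacklist like that is simple to sketch. Here's a minimal, hypothetical example (the result URLs and blocked domains are made up for illustration; a real search engine would return structured hits, not bare strings):

```python
from urllib.parse import urlparse

# Hypothetical search results -- just URLs for the sake of the sketch.
results = [
    "https://en.wikipedia.org/wiki/Privacy",
    "https://www.facebook.com/some-page",
    "https://example.org/article",
]

# User-defined blacklist of domains to hide from results.
blacklist = {"facebook.com", "pinterest.com"}

def blocked(url: str) -> bool:
    host = urlparse(url).netloc.lower()
    # Match the domain itself and any of its subdomains (www.facebook.com, etc.).
    return any(host == d or host.endswith("." + d) for d in blacklist)

filtered = [url for url in results if not blocked(url)]
print(filtered)
```

The subdomain check matters: matching on substrings alone would also block unrelated domains like `notfacebook.com`.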


Exactly! It gets tedious typing “” every time.

Which reminds me, another good feature would be a small text box in the menu bar, or a drop-down area, where you could store a bit of often-used text for copy/paste actions. (Maybe there’s already an extension for that.)


I think there’s a bigger issue with “search” than just this change at DDG.

Every search provider effectively has to use some page ranking. There are many questions and problems:

  • Is the page ranking algorithm disclosed? (probably not)
  • Is there human input to page ranking?
  • Does that human input come solely from users of the search engine, solely from its staff, or from both?
  • What other factors besides human input determine page ranking?

I think most people would be unhappy with a purely random page ranking. You want the information that you are looking for to appear in the first few hits, not in hit #72 on page 8.

Search engine providers would also prefer that the information is in the first few hits because it reduces the load on their systems.

Suppose hypothetically that we had

  • a fully transparent page ranking algorithm
  • no input from search engine staff (hence no explicit censorship)
  • no other factors besides human input (e.g. you can’t pay for high ranking)

So the only input would be human input from users of the service, i.e. users indicating that the page they chose contained the information they were searching for, given the search term.
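The hypothetical above can be sketched as a toy model (everything here is invented for illustration; no real engine ranks this simply): the only ranking signal is a per-query feedback count, and results are sorted by it.

```python
from collections import defaultdict

# Toy model: the only ranking signal is user feedback -- a count of how
# often users indicated that a result answered a given query.
feedback = defaultdict(int)

def record_feedback(query: str, url: str) -> None:
    feedback[(query, url)] += 1

def rank(query: str, candidates: list) -> list:
    # Sort purely by accumulated user feedback, highest first.
    # The algorithm is fully transparent: this one line is the whole thing.
    return sorted(candidates, key=lambda u: feedback[(query, u)], reverse=True)

record_feedback("privacy", "https://a.example")
record_feedback("privacy", "https://a.example")
record_feedback("privacy", "https://b.example")
print(rank("privacy", ["https://b.example", "https://a.example"]))
```

Even in this fully transparent setup the problems below remain: whichever page most users already pick accumulates the most feedback, so the ranking self-reinforces toward mainstream choices.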

That would still be vulnerable to politically- or ideologically-motivated campaigns of manipulation. But let’s put malicious activity to one side.

That would still lead to a form of censorship. Mainstream ideas would still predominate.

Is there a perfect solution that keeps everybody happy?

To me, censorship implies removal, which your analogy does not suggest. I also don’t think censorship is the same as “judging to be irrelevant.”


I can see that, as a strict interpretation. I was just going with the topic as written.

Whether you want to call it censorship or something else isn’t really my point - which was about the problems inherent in page ranking.

(It is of course possible that with input from search engine staff there is explicit censorship, i.e. going beyond down-ranking. Google already does this, e.g. in response to DMCA takedowns and other such requests.)

Down-ranking is a kind of suppression of ideas. By pushing ideas beyond the rank that most people’s patience stops at, the ideas are hidden from view even though never removed as such. Maybe “suppression” or “hiding” work better for you as terms than “censorship”.
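To make the "hidden but never removed" point concrete, here's a toy illustration (the scores, penalty factor, and page IDs are all made up): a multiplicative down-rank penalty pushes one page from near the top of the list to the last page of results, where almost nobody scrolls.

```python
# Toy illustration: a down-rank penalty hides a page without removing it.
RESULTS_PER_PAGE = 10

# 100 hypothetical pages with descending base relevance scores.
pages = [("page%03d" % i, 1000 - i) for i in range(100)]

# Hypothetically flagged as "associated with disinformation".
penalized = {"page002"}

def score(page_id: str, base: float) -> float:
    # A 10x down-rank penalty; the page is still indexed, just buried.
    return base * (0.1 if page_id in penalized else 1.0)

ranked = sorted(pages, key=lambda p: score(*p), reverse=True)
position = [pid for pid, _ in ranked].index("page002") + 1
result_page = (position - 1) // RESULTS_PER_PAGE + 1
print(position, result_page)
```

Without the penalty, `page002` would sit at position 3 on page 1; with it, the page drops to the very bottom of the list, well past where most people stop reading.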


The bigger issue is that there’s no open search index that is ready to go right now. And as long as this isn’t solved, search engines are always bound to the big American and Russian indexes. That’s not only a loss of control (none of us knows how they filter and rank things), but also a loss of specialized search engines for specific kinds of websites or data.

Page rankings would also be harder to manipulate, because there would be more diversity among search engines. And specialized engines would make it much easier to find uncommon things, since unrelated results could be filtered out by querying the index directly (rather than going through Google’s service, for example), and even more innovative things could be built on top.

But right now there’s no option other than looking for a non-manipulative, anonymizing search engine.