Duckduckgo censorship

On the other hand, I wouldn’t mind a search engine that gave me an optional toggle:

Remove obviously crap search results from this list. :slightly_smiling_face:

6 Likes

Speaking to that, they should allow a user-defined filter. A blacklist, I guess, so you wouldn't see anything from Facebook or wherever.

5 Likes

Exactly! It gets tedious typing “-facebook.com -instagram.com -youtube.com -twitter.com” every time.
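For what it's worth, that repetition could be scripted away. A minimal sketch, assuming the engine accepts the `-domain` exclusion syntax shown above (the function name and blocklist are just made up for illustration):

```python
# Hypothetical helper: prepend a personal blocklist to every query so the
# exclusions never have to be typed by hand. Exact operator syntax varies
# by engine (some prefer "-site:domain").
BLOCKLIST = ["facebook.com", "instagram.com", "youtube.com", "twitter.com"]

def with_blocklist(query: str, blocklist=BLOCKLIST) -> str:
    """Append '-domain' exclusion terms to a search query."""
    exclusions = " ".join(f"-{d}" for d in blocklist)
    return f"{query} {exclusions}"

print(with_blocklist("privacy phone"))
# → privacy phone -facebook.com -instagram.com -youtube.com -twitter.com
```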

Which reminds me, another good feature would be a small text box in the menu bar, or a drop-down area, where you could store a bit of often-used text for copy/paste actions. (Maybe there’s already an extension for that.)

3 Likes

I think there’s a bigger issue with “search” than just this change at DDG.

Every search provider effectively has to use some page ranking. There are many questions and problems:

  • Is the page ranking algorithm disclosed? (probably not)
  • Is there human input to page ranking?
  • Does that human input come solely from users of the search engine, solely from its staff, or from both?
  • What other factors besides human input determine page ranking?

I think most people would be unhappy with a purely random page ranking. You want the information that you are looking for to appear in the first few hits, not in hit #72 on page 8.

Search engine providers would also prefer that the information is in the first few hits because it reduces the load on their systems.

Suppose hypothetically that we had

  • a fully transparent page ranking algorithm
  • no input from search engine staff (hence no explicit censorship)
  • no other factors besides human input (e.g. you can’t pay for high ranking)

So the only input would be human input from users of the service, i.e. users indicating that the page they chose actually contained the information they were searching for, given the search term.
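As a toy sketch of that feedback-only ranking (all names invented; a real system would need far more, but the mechanism is just this):

```python
from collections import defaultdict

# The only ranking signal is users confirming that a page answered
# their query for a given search term.
feedback = defaultdict(lambda: defaultdict(int))  # term -> url -> confirmations

def confirm(term: str, url: str) -> None:
    """A user indicates that `url` contained what they searched for."""
    feedback[term][url] += 1

def rank(term: str, candidates: list) -> list:
    """Order candidate pages by confirmation count, most confirmed first."""
    return sorted(candidates, key=lambda u: feedback[term][u], reverse=True)
```

Here `confirm` might be wired to a "this answered my question" button in the client; the algorithm itself is fully transparent.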

That would still be vulnerable to politically- or ideologically-motivated campaigns of manipulation. But let’s put malicious activity to one side.

That would still lead to a form of censorship. Mainstream ideas would still predominate.

Is there a perfect solution that keeps everybody happy?

To me, censorship implies removal, which your analogy does not suggest. I also don’t think censorship is the same as “judging to be irrelevant.”

2 Likes

I can see that, as a strict interpretation. I was just going with the topic as written.

Whether you want to call it censorship or something else isn’t really my point - which was about the problems inherent in page ranking.

(It is of course possible that with input from search engine staff there is explicit censorship, i.e. going beyond down-ranking. Google already does this e.g. in response to DMCA requests and other related requests.)

Down-ranking is a kind of suppression of ideas. By pushing ideas beyond the rank that most people’s patience stops at, the ideas are hidden from view even though never removed as such. Maybe “suppression” or “hiding” work better for you as terms than “censorship”.

2 Likes

The bigger issue is that there's no open search index that's ready to go right now. And as long as this isn't solved, search engines are always bound to the big American and Russian indexes. That's not only a loss of control (none of us knows how they filter and rank things), but also a loss of specialized search engines for specific kinds of websites or data.

It would also be harder to manipulate page rankings, because there would be more diversity among search engines. And specialized ones would make it much easier to find uncommon things, since unrelated results could be filtered out by querying the index directly (rather than going through Google's service, for example), or even more innovative things could be built on top.

But right now there's no option other than looking for a non-manipulative, anonymized search engine.

Well, yes, but what do the hardware requirements look like for an index of the entire public web? I expect that the only way that it could be done “open” is if it is a distributed system.

Open Search Foundation said:

The first steps have already been taken. Together with experts from European computer centres and research institutions, we are promoting decentralised indexing experiments and the development of advanced concepts.

Sounds like there's still a long way to go, but with cooperation among many companies and institutions, decentralization with combined, powerful hardware is possible. I guess it's the only realistic way to meet such hardware requirements.

2 Likes

I’m using swisscows, and not having problems so far.

Speaking of “censorship,” Google is the king, as anyone who uses Google search directly only sees what Google decides to show.

3 Likes

I’m getting the feeling there isn’t a viable alternative search engine that doesn’t censor results? Eventually all we see will be completely controlled by some entity (or small handful of entities) using methods we don’t really understand to push an opinion or agenda that goes against true freedom of information?

The end user should be the only one deciding what is, or is not, misinformation, or what is, or is not, relevant information. I don’t think anyone here wants someone else deciding that for them, but I could be wrong.

2 Likes

Maybe a metasearch engine like searx? At least it pulls results from multiple sources.

EDIT: A nice option that searx makes available: Neocities / Random SearX Redirector

Another option: Make your own searX instance.

3 Likes

I am very anti-censorship, but it is difficult to imagine any search engine without it, decentralized and open or otherwise. Curating information is inherent to the function of a search engine. The “censorship” might be ideologically driven to a greater or lesser degree, but I’m not sure a search engine without it is desirable. If someone is maliciously creating thousands of impostor web pages to “hide” a legitimate page, shouldn’t those pages be “censored”?

3 Likes

If “censorship” can be based on fact, then yes. It seems to me it is almost universally based on opinion.

1 Like

MetaGer lets you define user filters. I don't let it return hits from Microsoft sites, so I can counter Bing a little.

4 Likes

The most insulting thing I read about the DuckDuckGo disappointment was something one of their leadership said… they claimed it wasn't really censorship, it's "just search rankings". They're pretending that isn't effectively the same thing when a site is ranked artificially low so that it never gets seen.

How stupid do they think we are?

I tend to use Presearch now. Can be a bit slow at times but I like the tech.

5 Likes

It could be argued that they should be “censored” on the client side so that it is 100% within your control as to what you see and what you don’t see. That is, in the index no pages are ever censored. In terms of the pages (URLs) that the server provides to the client no pages are ever censored.

That of course implies that there is a client. However if we are imagining a new distributed open search implementation then we can also imagine an open source client for it.
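A minimal sketch of what such client-side filtering could look like, assuming a hypothetical open client that receives an unfiltered ranked list of `(url, title)` results from the server (all names here are invented for illustration):

```python
from urllib.parse import urlparse

# The server returns an uncensored result list; anything the user has
# blocked is dropped locally, 100% under the user's control.
USER_BLOCKLIST = {"facebook.com", "doubleclick.net"}

def visible_results(results, blocklist=USER_BLOCKLIST):
    """Keep only results whose host (or any parent domain) isn't blocked."""
    kept = []
    for url, title in results:
        host = urlparse(url).hostname or ""
        parts = host.split(".")
        # match "facebook.com" against "www.facebook.com" as well
        domains = {".".join(parts[i:]) for i in range(len(parts))}
        if not (domains & blocklist):
            kept.append((url, title))
    return kept
```

The point of the design is that the index and server stay neutral; the policy lives entirely in the client, where each user can set it differently.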

2 Likes

I mean, it's pretty easy to imagine, or even remember if you're old enough… just go back to the time when search engines indexed every page of every site, and when you searched for something, the first hundred or so results were different pages of the same site, ranked in the order they came back from the index rather than by relevance.

I’d be curious how many people are wanting to go back to the good old days of searching through results by the tens of pages hoping you didn’t skip what you were looking for and how many just don’t understand what they’re asking for when they say no censorship of any kind because they didn’t live through it.

3 Likes

But what if you want to find such fake pages, maybe for research purposes or whatever? Of course a default search engine shouldn't return them. However, the ability to look for unusual stuff is sometimes a legitimate requirement. That's why I'm advocating so hard for an open search index. Once we have such a thing, we can use many search engines for different purposes: daily usage (don't display fake pages), results from only small sites (excluding the big known ones), a fake-page machine, and many other things.

The difference between censorship and a manipulated ranking order (for example, to filter malicious fake pages) is whether there are alternatives that let you choose whatever you want. But nowadays there's no real alternative, since everything is Google, Bing, Yandex, etc., plus engines built on top of them. That comes at least close to censorship, even if it's usually not a problem for us.