I guess we all kinda knew that, but it’s always nice to have a study backing your opinions.

  • @UnderpantsWeevil
    10 months ago

    Keeping the measurements hidden in order to make it harder for them to become a goal is a decent way to go about it.

    The measure, from the perspective of Clickbaiters, is purely their own income stream. And there’s no way to hide that from the guy generating the clickbait.

    How would you measure “without ads”?

    We have a well-defined set of sites and services that embed content within a website in exchange for payment. An easy place to start is to look for those embeds on a site and downgrade it in the query results. We can also see, from redirects and AJAX calls off a visited website, when a lot of other information is being drawn in from third-party sites. That’s a big red flag for a site doing ad pop-ups/pop-overs and other gimmicks. (A rough sketch of what that downranking could look like follows.)
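
    Just to make the idea concrete, here’s a minimal sketch of that kind of penalty scoring. The ad-network domain list, the thresholds, and the weights are all illustrative assumptions on my part, not anything a real ranker uses:

    ```python
    from urllib.parse import urlparse

    # Illustrative (not exhaustive) set of domains associated with paid embeds.
    AD_NETWORK_DOMAINS = {
        "doubleclick.net",
        "googlesyndication.com",
        "adnxs.com",
        "taboola.com",
        "outbrain.com",
    }

    def ad_penalty(page_domain: str, requested_urls: list[str]) -> float:
        """Return a 0..1 downranking penalty based on ad embeds and third-party calls.

        requested_urls is every URL the page pulled in (scripts, iframes,
        redirects, XHR), e.g. as collected by a headless-browser crawler.
        """
        if not requested_urls:
            return 0.0
        third_party = 0
        ad_hits = 0
        for url in requested_urls:
            host = urlparse(url).netloc.lower()
            # Skip first-party requests.
            if not host or host == page_domain or host.endswith("." + page_domain):
                continue
            third_party += 1
            if any(host == d or host.endswith("." + d) for d in AD_NETWORK_DOMAINS):
                ad_hits += 1
        # Known ad networks weigh more than generic third-party chatter;
        # the 5/30 caps and 0.7/0.3 weights are arbitrary placeholders.
        score = 0.7 * min(ad_hits / 5, 1.0) + 0.3 * min(third_party / 30, 1.0)
        return round(score, 3)
    ```

    A page that loads a couple of DoubleClick scripts plus a dozen trackers ends up with a noticeable penalty, while a mostly self-contained page scores near zero.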

    I’m not sure it’s possible to build a good, completely open-source solution that isn’t either giving bad results by downranking good sites for the wrong reasons, or open to abuse by SEO.

    I would put more faith in an open-source solution than in a private model, purely because of the financial incentives behind their respective creation. The challenge with an open model is getting the storage and processing power to do all the web crawling.

    After that, it wouldn’t be crazy to go in the Wikipedia/Reddit direction and have user input grade the query results, assuming a core pool of reliable users could be established (see the sketch below).
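
    Something like this reputation-weighted tally is what I have in mind; the trusted-user pool, the weights, and the example votes are entirely made-up placeholders:

    ```python
    from collections import defaultdict

    # Hypothetical reliability weights for a core pool of trusted users;
    # everyone else gets a small default weight so one account can't swing results.
    TRUSTED_USERS = {"alice": 1.0, "bob": 0.9}
    DEFAULT_WEIGHT = 0.1

    def grade_results(votes: list[tuple[str, str, int]]) -> dict[str, float]:
        """Aggregate (user, result_url, vote) triples into weighted scores.

        vote is +1 (useful) or -1 (clickbait/ad farm); the weight of each vote
        depends on how established the voter is.
        """
        scores: dict[str, float] = defaultdict(float)
        for user, url, vote in votes:
            weight = TRUSTED_USERS.get(user, DEFAULT_WEIGHT)
            scores[url] += weight * vote
        return dict(scores)

    votes = [
        ("alice", "https://example.org/deep-dive", +1),
        ("bob", "https://example.org/deep-dive", +1),
        ("sock_puppet", "https://ad-farm.example", +1),
        ("alice", "https://ad-farm.example", -1),
    ]
    print(grade_results(votes))
    # The ad farm ends up negative despite the sock puppet's upvote.
    ```

    The point of weighting by an established core pool is the same one Wikipedia relies on: new or throwaway accounts can participate, but they can’t cheaply outvote the people with a track record.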