Google has reportedly removed many of Twitter’s links from its search results after the social network’s owner, Elon Musk, announced limits on how many tweets users can read.

Search Engine Roundtable found that Google had removed 52% of Twitter links from its index since the crackdown began last week. Twitter now blocks users who are not logged in and limits how many tweets logged-in users can read.

According to Barry Schwartz, Google reported 471 million Twitter URLs in its index as of Friday. By Monday morning, that number had plummeted to 227 million, a drop of roughly 52%.

“For normal indexing of these Twitter URLs, it seems like these tweets are dropping out of the sky,” Schwartz wrote.

Platformer reported last month that Twitter refused to pay its bill for Google Cloud services.

  • AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world · 1 year ago

    Elon is going to complain about yet another conspiracy against him, when in reality crawlers that can’t open a URL simply assume the page no longer exists. Google certainly didn’t “retaliate”; the bots just couldn’t reach those pages anymore.
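    Roughly speaking, the crawler-side logic is as dumb as this (a toy sketch, not Google’s actual pipeline; the function name and details are made up):

    ```python
    # Toy sketch: how a generic crawler decides a page has "disappeared".
    import urllib.request
    from urllib.error import HTTPError, URLError

    def still_reachable(url: str) -> bool:
        """A page that no longer answers 200 to an anonymous fetch looks gone."""
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return resp.status == 200
        except HTTPError:
            # Login walls usually answer 401/403; to a bot that is
            # indistinguishable from a page that was deleted.
            return False
        except URLError:
            return False

    # If still_reachable() keeps returning False for a tweet URL, the crawler
    # eventually drops it from the index. No retaliation required.
    ```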

    • danc4498@lemmy.world · 1 year ago

      The headline is actually wrong. Google did not do anything to Twitter. Twitter fucked up its own SEO by removing access to its content.

      • Instigate@aussie.zone · 1 year ago

        Yeah, that’s a pretty easy and reasonable conclusion to come to if you think about it for more than five seconds. I’m not sure Elon has any toes left after he keeps shooting himself in the feet.

    • bingbong@lemmy.world · 1 year ago

      The latest in a seemingly never-ending series of self-owns. Apart from the stress it must put on their devs, it’s been entertaining.

    • coffeetest@kbin.social · 1 year ago

      Crawl issues, I’m sure, but also user experience issues. Google is sensitive to sending visitors to sites where metrics like bounce rate indicate users don’t stick around. I don’t use twt, but if it’s the case that you have to be logged in to see anything now, a non-logged-in user will click a link from Google, hit a login page, and hit the back button. I would assume Google sees that as a bad search result and uses it less.
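      Nobody outside Google knows the real signals, but purely as an illustration, a penalty for that kind of quick bounce could look something like this (names and numbers are invented):

      ```python
      # Illustrative only: a made-up "quick bounce" penalty, not Google's ranking logic.
      from dataclasses import dataclass

      @dataclass
      class ClickStats:
          clicks: int          # times the result was clicked from the results page
          quick_returns: int   # times the user came straight back within seconds

      def adjusted_score(base_relevance: float, stats: ClickStats) -> float:
          """Downrank results that users keep bouncing straight back from."""
          if stats.clicks == 0:
              return base_relevance
          bounce_rate = stats.quick_returns / stats.clicks
          # A login wall pushes bounce_rate toward 1.0 and steadily sinks the result.
          return base_relevance * (1.0 - 0.5 * bounce_rate)

      print(adjusted_score(0.9, ClickStats(clicks=100, quick_returns=85)))  # ~0.52
      ```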

      • Pseu@kbin.social · 1 year ago

        If I were making a web crawler, I would make it so that when a domain appears to have changed dramatically or gone offline, it re-crawls the domain and flags the already-crawled pages as potentially obsolete.
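        Something like this toy sketch; the 50% threshold, field names and data shapes are all just placeholders:

        ```python
        # Toy sketch of the idea above; threshold and structures are invented.
        import time
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class PageRecord:
            url: str
            last_status: int             # HTTP status from the most recent fetch
            obsolete: bool = False
            flagged_at: Optional[float] = None

        def review_domain(pages: list[PageRecord], threshold: float = 0.5) -> bool:
            """If a large share of a domain's pages suddenly stop returning 200,
            flag them as potentially obsolete and request a re-crawl instead of
            silently dropping them."""
            failed = sum(p.last_status != 200 for p in pages)
            if failed / len(pages) <= threshold:
                return False                   # domain looks healthy, nothing to do
            for p in pages:
                p.obsolete = True              # keep the record, just mark it suspect
                p.flagged_at = time.time()
            return True                        # caller should queue a full re-crawl

        # Example: anonymous fetches of a domain now mostly hit a login wall (403).
        domain = [PageRecord("https://twitter.com/a/status/1", 403),
                  PageRecord("https://twitter.com/a/status/2", 403),
                  PageRecord("https://twitter.com/a/status/3", 200)]
        print(review_domain(domain))  # True: 2/3 failed, pages flagged for re-crawl
        ```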