• CheeseNoodle@lemmy.world · 1 day ago

    Me: Wow, this is terrible. I should download a local copy of Wikipedia, it's only like 30 gigs, right?

    Wikipedia: Here's a nepenthes-like trap of pages that don't actually have a download link and all just link back to each other. Also, the actual archive is in a format no one's ever heard of and needs a dedicated reader AND dedicated, very suspicious-looking torrenting software to download in the first place.

    I still haven't figured that shit out. Every now and then there's a push to get people to back it up locally, but then it seems like they deliberately make it as hard as they possibly can.

    • ILikeTraaaains@lemmy.world · 1 day ago

      It has been a while since I needed to download Wikipedia, but it is very easy. AFAIK, Wikimedia publishes dumps of all its wiki sites.

      The weird file formats are just compressed SQL and XML. Maybe it is a bit more complex to run Wikipedia locally, but the content itself is easy to retrieve.

      The only issue is that SQL and XML are plain text, and plain text compresses very well, so a 30GB backup can easily become 100GB uncompressed.
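      The compression point is easy to demonstrate. A minimal sketch, using synthetic repetitive XML-like text as a stand-in for a real dump (actual ratios depend on the content and the compressor; Wikimedia dumps typically ship as `.bz2` or `.7z`):

      ```python
      import bz2

      # Synthetic, highly repetitive XML-ish text standing in for a real dump.
      # Real article text compresses less dramatically, but still several-fold.
      sample = (
          "<page><title>Example</title><revision>"
          "<text>Lorem ipsum dolor sit amet. </text>"
          "</revision></page>\n"
      ) * 10_000

      raw = sample.encode("utf-8")
      packed = bz2.compress(raw)

      ratio = len(raw) / len(packed)
      print(f"raw: {len(raw):,} bytes, "
            f"compressed: {len(packed):,} bytes, "
            f"ratio: {ratio:.0f}x")
      ```

      With real dump data the ratio is closer to the 3x mentioned above (30GB compressed, ~100GB uncompressed), since article text is far less repetitive than this toy sample.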

    • InFerNo@lemmy.ml · 1 day ago

      You can also install MediaWiki and import the database dump, but then you'd need a local web server to host it.