• adO.Nis@lemmy.dbzer0.com · 28 days ago

    Fuck… Now that RAM prices are skyrocketing, we’re gonna see hoarders buying hundreds of TB of storage, leading to price hikes there too

  • morto@piefed.social · 29 days ago

    It would be awesome if we had an app that allowed you to stream directly from such torrents, and had a user-made recommendation system to replace the discovery algorithm :D

    • Bizzle@lemmy.world · 29 days ago

      Stremio + Torrentio does this for TV but I haven’t found an equivalent for music. Hoping to be proven wrong 🤞

    • Valmond@lemmy.dbzer0.com · 28 days ago

      I wonder how they are splitting it up in different torrents and how many.

      They are splitting it up right? 😁

  • souperk@reddthat.com · 27 days ago

    Has anyone tried to self-host this? Of course, hosting 300TB isn’t practical, so any solution would need to download the metadata and fetch songs on demand.

  • Knock_Knock_Lemmy_In@lemmy.world · 28 days ago

    I have 3TB of space to share. Are there 150 other people like me that want to do some distributed hosting? What technology can handle this?

    • Piece_Maker@feddit.uk · 28 days ago

      Torrents manage this without any extra tech. Just grab the .torrent file and select only the files you have space for. Download them, then seed, and get your 150 friends to each take different files. If I then go to download a file that’s in your batch, it comes from your machine (and whoever else is seeding those files); if it’s in one of the other 150 people’s batches, it downloads from there.
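
      The “each friend takes different files” split above can be sketched in a few lines. This is pure illustration, not any real client’s API: the file names and peer count are made up, and real clients express the same thing through per-file priorities.

```python
# Sketch of dividing a torrent's file list among a group of seeders so
# each person downloads a disjoint subset. Round-robin keeps the split
# roughly even. File names and counts here are hypothetical.

def split_files(files, num_peers):
    """Assign each file to exactly one peer, round-robin."""
    assignment = {p: [] for p in range(num_peers)}
    for i, name in enumerate(files):
        assignment[i % num_peers].append(name)
    return assignment

files = [f"track_{i:05d}.ogg" for i in range(1000)]
plan = split_files(files, num_peers=150)

# With 1000 files and 150 peers, peers 0-99 get 7 files, 100-149 get 6.
print(len(plan[0]), len(plan[149]))  # 7 6
```

      Most torrent clients let you do this by hand: set a file’s priority to zero (“don’t download”) and it is skipped but everything you do hold is still seeded, so the only coordination needed is agreeing on who takes which slice.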

      • Knock_Knock_Lemmy_In@lemmy.world · 28 days ago (edited)

        Yes, almost. But I want it to seem like I have all 400TB on my 3TB drive. I want the cache to handle downloads automatically, and I want an even spread of files so that none get lost.

        Torrents give me:

        • Chunking
        • Redundancy via multiple seeders
        • Partial downloads

        But they do not give me:

        • A unified filesystem view
        • Automatic caching & eviction
        • Guarantees that every file stays available
        • User side load balancing or placement control
        • A way to say “this file must exist on N peers”

        Maybe I should be typing this into an LLM.

        Edit: ChatGPT is suggesting an IPFS Cluster

        Edit 2: If the torrent stays active I can remove some requirements
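
        The “this file must exist on N peers” item from the wish list above can actually be done client-side with rendezvous (highest-random-weight) hashing, which is roughly the kind of allocation an IPFS Cluster performs. A stdlib-only sketch, with hypothetical peer names:

```python
import hashlib

def owners(filename, peers, replicas=3):
    """Rank peers by a hash of (peer, filename) and take the top N.
    Every client computes the same ranking independently, so no
    coordinator is needed, and losing one peer only reassigns the
    files that peer was ranked in the top N for."""
    ranked = sorted(
        peers,
        key=lambda p: hashlib.sha256(f"{p}:{filename}".encode()).hexdigest(),
        reverse=True,
    )
    return ranked[:replicas]

peers = [f"peer{i:03d}" for i in range(150)]

# Every client independently agrees which 3 peers must hold this file.
print(owners("rare_demo_tape.ogg", peers))
```

        Because the hash mixes peer and filename, assignments spread roughly evenly across peers, and any client can verify whether it should be holding a given file without talking to anyone.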

  • blitzen@lemmy.ca · 29 days ago

    As far as I’ve read, the database is largely low-bitrate files, plus some AI-generated tracks. The value here is the metadata and the preservation of “rare” music.

    • selokichtli@lemmy.ml · 29 days ago

      It’s not lossless, but current Ogg Vorbis at 160kbps is absolutely transparent to the vast majority of people. That’s actually what I chose for my own collection, I mean, outside of the lossless albums that I absolutely want to preserve flawlessly.

  • Cocodapuf@lemmy.world · 28 days ago

    How many full seeds are there? I mean, how many could there be? Who has 300TB to throw at this?

    • Auth@lemmy.world · 28 days ago

      Given some of the collections I’ve seen on private trackers, I’d say there are going to be quite a few people seeding this in its entirety.

      • HeyJoe@lemmy.world · 29 days ago

        300TB is a lot, but it’s kind of crazy to think this entire company only needs 300TB storage arrays to function. I wonder how they handle things internally. I would imagine at least one backup server ready to go in HA. I wonder if they have multiple regions across the country that also serve up the same setup.

        • 🦄🦄🦄@feddit.org · 29 days ago

          Afaik 300 TB is just the most popular music and around a third of all tracks. The blog post on Anna’s is quite entertaining tho.

        • rainwall@piefed.social · 29 days ago

          Likely cloned Netflix’s “Netflix in a box” design, where they drop a large 200TB+ NAS into thousands of different CDN datacenters with their most popular content cached, so that total traffic across the internet at large is minimal.

          Spotify being mainly music with very little video likely makes this even easier.

        • JohnEdwa@sopuli.xyz · 29 days ago

          IIRC there’s still something like 700TB of low-popularity music missing, but it accounts for only about 0.4% of listens.
          And they need more storage overall because they have to set up datacenters around the world; it doesn’t make sense to stream tens of millions of connections across the ocean. But that also gives them all the backups one would need for “free”.
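
          For what it’s worth, the figures quoted across this thread roughly line up. A quick check, assuming storage size tracks the number of tracks (which is a guess):

```python
# Numbers taken from the comments above: ~300TB of popular tracks in
# the dump, ~700TB of low-popularity tracks reportedly still missing.
cached_tb = 300
missing_tb = 700
total_tb = cached_tb + missing_tb

# 300 / 1000 = 0.3, matching the "around a third of all tracks"
# estimate elsewhere in the thread.
print(cached_tb / total_tb)  # 0.3
```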