
[Bug]: Podcasts duplicate in UIs #2785

Closed
BlackHoleFox opened this issue Mar 23, 2024 · 16 comments
Labels
bug Something isn't working


@BlackHoleFox

Describe the issue

Audiobookshelf seems to be generating, or somehow finding, duplicate episodes of multiple podcasts I have added and listen to. This is similar to #2122 but doesn't create duplicate files on disk and happens in versions after that bugfix.

Weirdly this doesn't happen with every single podcast in my library, just a few. The ones I notice the most are:

Nothing jumps out in my logs either, though let me know if I should enable debug logs for a week or so to help.

I've deleted duplicates from the library multiple times at this point, but they just seem to be coming back. For example, AMCA's (A More Civilized Age) library entry has every new episode duplicated right now:
[screenshot: AMCA library entry with every new episode duplicated]

One copy of each is not linked to an RSS episode:
[screenshot: duplicate episode not linked to an RSS episode]

And there are no duplicates on disk; they exist purely in the database/UI. They show up duplicated in both the web UI and the Android app:
[screenshot: the same duplicates in the web UI and the Android app]

Steps to reproduce the issue

  1. Add https://amorecivilizedage.net/rss as a podcast in your library
  2. Enable Schedule Automatic Episode Downloads. I have it running every day at midnight.
  3. Wait a few days

Audiobookshelf version

2.8.0

How are you running audiobookshelf?

Docker

@BlackHoleFox BlackHoleFox added the bug Something isn't working label Mar 23, 2024
@advplyr
Owner

advplyr commented Mar 23, 2024

Can you disable the watcher in server settings as a test to see if you get duplicates on any episodes downloaded after that?

@BlackHoleFox
Author

Turned it off and restarted Audiobookshelf's container. I'll clean up the duplicates in the two library items I mentioned and see if they come back.

@BlackHoleFox
Author

BlackHoleFox commented Mar 24, 2024

Update: It seems to be the library scanner causing these? It runs periodically, which would explain why they show up after a delay. It ran automatically in the last hour or two and the duplicates showed up. I cleaned out the duplicates from both podcasts once more and manually started a scan from Settings --> Libraries --> Scan. The scan finished and told me some items updated:

[screenshot: scan summary reporting items updated]

And then they showed up again on the podcast's library page. All the duplicated episodes also showed up in the Newest Episodes/recently added row. The actual newest episode is one to the right, off screen:
[screenshot: Newest Episodes row filled with the duplicated episodes]

I can reliably reproduce this now with the steps above.

@advplyr
Owner

advplyr commented Mar 24, 2024

Can you share some details about how you are mapping volumes in Docker and what filesystem/OS you are using?

@BlackHoleFox
Author

Yeah, here you are:

  • OS: Ubuntu 22
  • Runtime: podman 3.4.4
  • Filesystem: ext4 but see below for the rest.

My Audiobookshelf data is split between two places: the metadata and config are stored on the host, but the podcasts themselves are stored elsewhere (for size and reliability reasons) and mounted via SMB on the host running the container. From there, the podcasts folder is bind-mounted into Docker so Audiobookshelf can see it. Here's the relevant part of my Docker Compose config:

volumes:
      - /home/user/deployment-data/audiobookshelf/audiobooks:/audiobooks:rw
      - /mnt/media/podcasts:/podcasts:rw
      - /home/user/deployment-data/audiobookshelf/metadata:/metadata
      - /home/user/deployment-data/audiobookshelf/config:/config

@advplyr advplyr added the unable to reproduce Issue is not yet reproducible label Mar 31, 2024
@BlackHoleFox
Author

@advplyr Hiyo, I'm back with more info after more digging around. I started under the hypothesis that this might be something specific to my SMB share setup, but after reviewing all the Audiobookshelf logs with debug logging turned on, I saw nothing pointing at the storage location as the cause of the weird broken episode duplicates.

Starting this off, I double- and triple-checked that there aren't duplicates in the RSS feed:
[screenshot: RSS feed showing a single entry for the episode]

And now, here are the debug and scanner logs from initiating a scan after removing one of the duplicate podcast episodes from the UI. It's a little odd that it says "updated" here while the scanner log says "new podcast episode".

[LibraryScanner] Library scan e1099d4c-caca-4ea5-a6aa-6508b2343855 completed in 0:07.0 | 0 Added | 1 Updated | 0 Missing
[ApiCacheManager] library.afterUpdate: Clearing cache
[LibraryScan] Scan log saved "/metadata/logs/scans/2024-04-28_e1099d4c-caca-4ea5-a6aa-6508b2343855.txt"
[LibraryController] Scan complete

And then the relevant part of the scan log. The rest of it just says every other podcast is up to date:

{"timestamp":"2024-04-28T04:35:57.662Z","message":"New library file found with path \"/podcasts/shared/A More Civilized Age -  A Star Wars Podcast/82 - Always Two There Are and Brothers of the Broken Horn (Rebels 20 - 21).mp3\" for library item \"A More Civilized Age -  A Star Wars Podcast\"","levelName":"INFO","level":2}
{"timestamp":"2024-04-28T04:35:57.663Z","message":"Library item \"A More Civilized Age -  A Star Wars Podcast\" changed: [size,lastScan]","levelName":"DEBUG","level":1}
{"timestamp":"2024-04-28T04:35:58.246Z","message":"Mapping metadata to key tagComment => description: "TRIMMED","levelName":"DEBUG","level":1}
{"timestamp":"2024-04-28T04:35:58.249Z","message":"Mapping metadata to key tagSubtitle => subtitle: TRIMMED","levelName":"DEBUG","level":1}
{"timestamp":"2024-04-28T04:35:58.251Z","message":"Mapping metadata to key tagDate => pubDate: Wed, 20 Mar 2024 10:00:00 +0000","levelName":"DEBUG","level":1}
{"timestamp":"2024-04-28T04:35:58.252Z","message":"Mapping metadata to key tagTitle => title: 82: Always Two There Are and Brothers of the Broken Horn (Rebels 20 - 21)","levelName":"DEBUG","level":1}
{"timestamp":"2024-04-28T04:35:58.253Z","message":"New Podcast episode \"82: Always Two There Are and Brothers of the Broken Horn (Rebels 20 - 21)\" added","levelName":"INFO","level":2}
{"timestamp":"2024-04-28T04:35:58.298Z","message":"Mapping metadata to key tagAlbum => title: A More Civilized Age: A Star Wars Podcast","levelName":"DEBUG","level":1}
{"timestamp":"2024-04-28T04:35:58.299Z","message":"Mapping metadata to key tagArtist => author: A More Civilized Age","levelName":"DEBUG","level":1}
{"timestamp":"2024-04-28T04:35:58.301Z","message":"Mapping metadata to key tagGenre => genres: TV & Film:After Shows, Leisure:Animation & Manga","levelName":"DEBUG","level":1}
{"timestamp":"2024-04-28T04:35:58.302Z","message":"Mapping metadata to key tagLanguage => language: en","levelName":"DEBUG","level":1}
{"timestamp":"2024-04-28T04:35:58.303Z","message":"Mapping metadata to key tagPodcastType => podcastType: episodic","levelName":"DEBUG","level":1}
{"timestamp":"2024-04-28T04:35:58.311Z","message":"Found metadata file \"/metadata/items/51b244e1-bfd2-4e50-8bc3-a9e81954d099/metadata.json\"","levelName":"INFO","level":2}
{"timestamp":"2024-04-28T04:35:58.407Z","message":"Library item \"Scanline Media Patron Podcasts\" is up-to-date","levelName":"DEBUG","level":1}

When I remove one of these broken episodes, nothing indicates a problem:

[LibraryItem] Library item "51b244e1-bfd2-4e50-8bc3-a9e81954d099" updated

Looking at the full episode list, both the normal and the broken episode appear (stealth strike). The broken one goes away temporarily when I delete it from the UI, but it always returns after a library scan.
[screenshot: episode list showing the normal and broken copies of the episode]

So, with not much progress on the surface, I cloned the database to my computer and opened it up as well :) Despite deleting the broken episodes in the UI, they still had database entries, which look very wrong compared to the correct ones:
[screenshot: podcastEpisodes rows for the broken and correct episode]

There are a bunch of "haunted" items like this throughout multiple podcasts in my library, and they all have broken duplicate database entries:
[screenshot: other broken duplicate database entries]

The interesting/important thing is that when I deleted the broken record from the podcastEpisodes table, pushed the modified database to my server, and started the Docker container, it properly scrubbed the haunted episode away. When I scan the library, everything reports up-to-date and the duplicate doesn't reappear. Checking the RSS feed for new episodes doesn't find any junk either.

Pretty puzzled how they got there, honestly. Nothing weird or dangerous has been done to the database or filesystem. Maybe it's the result of broken RSS feeds churning and leaving remnants in the database if the creators re-uploaded episodes, etc.?

So that was a lot, but I'm interested in what you think of the broken database rows as someone who actually knows what everything should look like. Maybe a possible fix could be improving the scanner to look for any rows with a NULL index (and maybe no enclosureSize + enclosureType as well) and deleting them as a cleanup task? I could also manually scrub the database once but honestly these might eventually trickle back in, so an automatic purger could be better?
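
In case the context helps, here's roughly how I've been spotting the suspect rows while poking at my cloned copy of the database (read-only, and only against the copy, never the live file). The podcastEpisodes table and the index/enclosureSize/enclosureType columns are what I saw in the DB browser; the other column names in the query are my guess at the schema, so treat this as a sketch:

# spot_ghost_episodes.py -- read-only look at a *copy* of the Audiobookshelf
# sqlite database for the "haunted" rows described above (NULL index, no
# enclosure data). Column names besides index/enclosureSize/enclosureType are
# assumptions about the schema.
import sqlite3

DB_PATH = "absdatabase.sqlite"  # my cloned copy, not the live database

conn = sqlite3.connect(f"file:{DB_PATH}?mode=ro", uri=True)
conn.row_factory = sqlite3.Row

rows = conn.execute(
    """
    SELECT id, podcastId, title
    FROM podcastEpisodes
    WHERE "index" IS NULL
      AND enclosureSize IS NULL
      AND enclosureType IS NULL
    ORDER BY podcastId, title
    """
).fetchall()

for row in rows:
    print(row["id"], row["podcastId"], row["title"])
print(f"{len(rows)} suspect rows found")

# After eyeballing the output I deleted the offending rows by id (manually,
# in the copy) before pushing the database back to the server.
conn.close()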

@advplyr
Owner

advplyr commented Apr 28, 2024

The enclosure fields are only populated if the episode was downloaded from an RSS feed. Not all podcast episodes need to come from an RSS feed; you may have audio files in your file system that are a podcast but don't have an associated RSS feed. So we wouldn't want to remove them automatically from the db like you are suggesting.

That is what the duplicates you are seeing are. They are episodes that were scanned in from the file system and were not matched with an existing episode during the scan, so a new episode was created.

Still more information is needed to figure out what is going on. If you can enable Debug logs in the Logs page of settings, this will provide more information during the scan.
Then re-create the issue and we should see where both episodes are being created: one will be from the scanner while the other will be created during the RSS feed download.

Also, if you can, update to the latest version v2.9.0.

@BlackHoleFox
Author

So we wouldn't want to remove them automatically from the db like you are suggesting.

Got it 👍. Does the same train of thought apply to the NULL index field observation as well?

They are episodes that were scanned in from the file system and were not matched with an existing episode during the scan, so a new episode was created.

Sounds pretty strange, honestly. I don't manually edit the filesystem Audiobookshelf is working with, and I get everything from RSS. Could filesystem metadata changes (mtime, ACLs, etc.) break this, and only for a specific set of episodes? There have also never been duplicate files for these episodes on disk. The ghost entry points to the exact same MP3 file as the real episode, which is why I can't check the "hard delete" option for these ghost episodes in the UI, or it will delete the source file and leave the true episode broken.

If you can enable Debug logs in the Logs page of settings, this will provide more information during the scan.

All the logs I shared above were with Debug logging enabled :rip: There wasn't any more detail available from the scan which "brought back" a deleted ghost episode.

Then re-create the issue and we should see where both episodes are being created: one will be from the scanner while the other will be created during the RSS feed download.

Yeah I will try, but I still don't know what actually causes these to be created. They have just randomly appeared so far.

Also, if you can, update to the latest version v2.9.0.

I updated a few days ago, so all the logs above come from 2.9.0 :) Appreciate the help so far though.

@BlackHoleFox
Author

I'm back with more logs, if they're helpful @advplyr. A new episode of A More Civilized Age came out a few days ago and got duplicated in my library. Nothing about the storage backend for the episodes has changed, and I haven't modified any files manually.

I am currently running v2.11.0. Let me know if you want me to email you a zip file of the last week of logs.

From the logs on the day the episode downloaded:

{"timestamp":"2024-07-11 00:01:04.698","source":"ffmpegHelpers.js:180","message":"[FfmpegHelpers] downloadPodcastEpisode: Progress estimate 97% (275456 KB) for \"https://traffic.libsyn.com/secure/amorecivilizedage/AMCA_90_mixdown.mp3?dest-id=2530463\"","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-11 00:01:05.943","source":"ffmpegHelpers.js:180","message":"[FfmpegHelpers] downloadPodcastEpisode: Progress estimate 97% (275838 KB) for \"https://traffic.libsyn.com/secure/amorecivilizedage/AMCA_90_mixdown.mp3?dest-id=2530463\"","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-11 00:01:05.969","source":"ffmpegHelpers.js:183","message":"[FfmpegHelpers] downloadPodcastEpisode: Complete","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-11 00:01:06.372","source":"LibraryItem.js:310","message":"[LibraryItem] Success saving abmetadata to \"/metadata/items/51b244e1-bfd2-4e50-8bc3-a9e81954d099/metadata.json\"","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-11 00:01:07.258","source":"LibraryItem.js:305","message":"[LibraryItem] \"A More Civilized Age -  A Star Wars Podcast\" episode \"90: The Holocrons of Fate, The Antilles Extraction, and Hera's Heroes (Rebels 40 - 42)\" audioFile was updated from \"[object Object]\" to \"[object Object]\"","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-11 00:01:07.259","source":"LibraryItem.js:305","message":"[LibraryItem] \"A More Civilized Age -  A Star Wars Podcast\" episode \"90: The Holocrons of Fate, The Antilles Extraction, and Hera's Heroes (Rebels 40 - 42)\" extraData was updated from \"null\" to \"[object Object]\"","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-11 00:01:07.286","source":"ApiCacheManager.js:21","message":"[ApiCacheManager] podcastEpisode.afterUpdate: Clearing cache","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-11 00:01:07.307","source":"LibraryItem.js:294","message":"[LibraryItem] \"A More Civilized Age -  A Star Wars Podcast\" episode \"90: The Holocrons of Fate, The Antilles Extraction, and Hera's Heroes (Rebels 40 - 42)\" was added","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-11 00:01:07.334","source":"ApiCacheManager.js:21","message":"[ApiCacheManager] podcastEpisode.afterCreate: Clearing cache","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-11 00:01:07.335","source":"LibraryItem.js:386","message":"[LibraryItem] \"A More Civilized Age -  A Star Wars Podcast\" updatedAt updated from 1720656044551 to 1720656066369","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-11 00:01:07.336","source":"LibraryItem.js:386","message":"[LibraryItem] \"A More Civilized Age -  A Star Wars Podcast\" size updated from 15305796272 to 15588254086","levelName":"DEBUG","level":1}

And from the following days, when re-scans occurred:

{"timestamp":"2024-07-12 00:00:38.037","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Found metadata file \"/metadata/items/5cf3d2e6-22e0-4abc-b255-e07731deae4d/metadata.json\"","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:38.079","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Library file \"90 - The Holocrons of Fate, The Antilles Extraction, and Hera's Heroes (Rebels 40 - 42).mp3\" for library item \"/podcasts/shared/A More Civilized Age -  A Star Wars Podcast\" key \"size\" changed from \"141295616\" to \"282457814\"","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:38.081","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Library file \"90 - The Holocrons of Fate, The Antilles Extraction, and Hera's Heroes (Rebels 40 - 42).mp3\" for library item \"/podcasts/shared/A More Civilized Age -  A Star Wars Podcast\" key \"mtimeMs\" changed from \"1720656007991\" to \"1720656003348\"","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:38.082","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Library file \"90 - The Holocrons of Fate, The Antilles Extraction, and Hera's Heroes (Rebels 40 - 42).mp3\" for library item \"/podcasts/shared/A More Civilized Age -  A Star Wars Podcast\" key \"ctimeMs\" changed from \"1720656007991\" to \"1720656003348\"","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:38.084","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Library file \"90 - The Holocrons of Fate, The Antilles Extraction, and Hera's Heroes (Rebels 40 - 42).mp3\" for library item \"/podcasts/shared/A More Civilized Age -  A Star Wars Podcast\" key \"mtimeMs\" changed from \"1720656065935\" to \"1720656003348\"","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:38.085","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Library file \"90 - The Holocrons of Fate, The Antilles Extraction, and Hera's Heroes (Rebels 40 - 42).mp3\" for library item \"/podcasts/shared/A More Civilized Age -  A Star Wars Podcast\" key \"ctimeMs\" changed from \"1720656065935\" to \"1720656003348\"","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:38.088","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Library item \"A More Civilized Age -  A Star Wars Podcast\" changed: [size,lastScan]","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:38.137","source":"ApiCacheManager.js:21","message":"[ApiCacheManager] libraryItem.afterUpdate: Clearing cache","levelName":"DEBUG","level":1}
...
{"timestamp":"2024-07-12 00:00:42.068","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Podcast episode \"90: The Holocrons of Fate, The Antilles Extraction, and Hera's Heroes (Rebels 40 - 42)\" keys changed [audioFile]","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:42.084","source":"ApiCacheManager.js:21","message":"[ApiCacheManager] podcastEpisode.afterUpdate: Clearing cache","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:42.084","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Mapping metadata to key tagComment => description: <p>Rebels season 3 is in full swing. How can you tell? Well, first of all, Uncle Maul is back and he's leading Ezra (and his collection of holocrons) closer to the dark side. And Hera once again shows out, with a killer episode that brings us closer into contact with emerging season villain Admiral Thrawn. And... well... We've got another crack at a Sabine episode. Maybe this time we'll learn something about her, or how she relates to others, or what her past is, or... anything at all. Maybe? Please?</p> <p><a href=\"https://www.patreon.com/civilized\"><em>Support the show by going to Patreon.com/civilized!</em></a></p> <p>NEXT TIME: The Last Battle, Imperial Supercommandos, and Iron Squadron</p> <p>Show Notes</p> <a href=\"https://www.youtube.com/watch?v=TJ9cCxFiEDs\" target=\"_blank\">RIC-1200 Droids | SW Clone Wars &amp; Rebels</a>   <em></em> <em></em> <em></em>   <em>Hosted by Rob Zacny (<a href=\"https://twitter.com/robzacny\">@RobZacny</a>)</em> <p><em>Featuring Alicia Acampora (<a href=\"https://twitter.com/ali_west\">@ali_west</a>), Austin Walker (<a href=\"https://twitter.com/austin_walker\">@austin_walker</a>), and Natalie Watson (<a href=\"https://twitter.com/nataliewatson\">@nataliewatson</a>)</em></p> <p><em>Produced by Ricardo Contreras (<a href=\"https://twitter.com/a_cado_appears\">@a_cado_appears</a>)</em></p> <p><em>Music by Jack de Quidt (<a href=\"https://twitter.com/notquitereal\">@notquitereal)</a></em></p> <p><em>Cover art by Xeecee (<a href=\"https://twitter.com/xeeceevevo\">@xeeceevevo</a>)</em></p> <p> </p>","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:42.085","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Mapping metadata to key tagSubtitle => subtitle: Rebels season 3 is in full swing. How can you tell? Well, first of all, Uncle Maul is back and he's leading Ezra (and his collection of holocrons) closer to the dark side. And Hera once again shows out, with a killer episode that brings us closer into...","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:42.085","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Mapping metadata to key tagDate => pubDate: Wed, 10 Jul 2024 10:00:00 +0000","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:42.085","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Mapping metadata to key tagTitle => title: 90: The Holocrons of Fate, The Antilles Extraction, and Hera's Heroes (Rebels 40 - 42)","levelName":"DEBUG","level":1}
{"timestamp":"2024-07-12 00:00:42.086","source":"LibraryScan.js:131","message":"[LibraryScan] \"Shared Podcasts\": Podcast episode \"90: The Holocrons of Fate, The Antilles Extraction, and Hera's Heroes (Rebels 40 - 42)\" keys changed [audioFile]","levelName":"DEBUG","level":1}

@advplyr
Owner

advplyr commented Jul 14, 2024

I can't think of any reasons for it

@advplyr
Owner

advplyr commented Aug 15, 2024

I came across the duplicate podcast episode, but after reading through your issue again I don't think it is the same. It is a different bug where the scanner can scan in the audio file while it is being downloaded, causing a duplicate episode with bad data like a different duration.

I'm not sure if I mentioned this before elsewhere, but Abs uses the inode value of the files to see if they already exist. If you are using CIFS, there is a setting that needs to be enabled called serverino in order for it to work properly with Abs.
See this comment and the thread #2509 (comment)

@BlackHoleFox
Author

Heyo, very glad to hear you were able to at least reproduce something like it, even if we aren't sure it's the same. Funnily enough, I think we might be on the same path though? My current hypothesis is that a race condition is occurring during downloading too :)

Shortly after I made that logging PR (thanks for improving on it), I started investigating another lead based on the log timestamps. After seeing a bunch of independent components' timestamps close together, I looked again at the database and saw that the corrupt episode and the real one were updated within milliseconds of each other (the one with the GUID is the right one):

[screenshot: updatedAt timestamps for the corrupt and real episode rows]

createdAt is also close:

[screenshot: createdAt timestamps for the same rows]

The reason I'm highly suspicious of this is that I changed the default schedule for some of my podcasts ages ago so they check for new episodes every day at midnight. Here's the one for A More Civilized Age, the podcast I've been using as my example in this thread. Along with it is the scheduled time for my automatic library scan, which is identical. Based on the database timestamps, the scheduled times also match when the episode gets downloaded and duplicated:

[screenshot: A More Civilized Age episode download schedule (daily at midnight)]
[screenshot: automatic library scan schedule (daily at midnight)]
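
To spell out the shape of the race I'm imagining, here's a toy Python sketch (not Audiobookshelf code, just an illustration of a check-then-insert race): if both the episode download and the scheduled scan ask "does this episode already exist?" and then insert a row, and the two run in the same window, each check can pass before the other's insert lands:

# race_sketch.py -- toy illustration (NOT Audiobookshelf code) of the
# check-then-insert race I suspect: a "downloader" and a "scanner" both fire
# at midnight, neither sees the other's row yet, and the same file ends up
# with two episode entries.
import threading
import time

episodes = []               # stands in for the podcastEpisodes table
db_lock = threading.Lock()

def add_if_missing(source, filename, work_delay):
    with db_lock:
        matched = any(e["file"] == filename for e in episodes)
    time.sleep(work_delay)  # downloading / probing the file happens here,
                            # between the match check and the insert
    if not matched:
        with db_lock:
            episodes.append({"file": filename, "source": source})

filename = "episode_90.mp3"
downloader = threading.Thread(target=add_if_missing, args=("rss-download", filename, 0.2))
scanner = threading.Thread(target=add_if_missing, args=("library-scan", filename, 0.1))

downloader.start(); scanner.start()
downloader.join(); scanner.join()

print(episodes)  # two entries for the same file

Obviously the real code paths are more involved than this, but staggering the two schedules removes the overlap, which is what the test below is about.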

I created a test library 5 days ago, also stored on the same CIFS/SMB mount, and gave it identical settings except that my sample podcast automatically checks for downloads every hour instead, so that it downloads long before midnight and the automatic scan's scheduled time. I was going to post an update here with my results once another episode came out, since I wanted to test it "organically" like it would occur in my main library. But since you seem to be thinking in the same direction as me, I'm dumping my notes early to maybe help out and save you unneeded work.

If you are using CIFS, there is a setting that needs to be enabled called serverino in order for it to work properly with Abs....

Thanks for the reference. I read through the thread and double-checked my CIFS/SMB mount for the podcasts. AFAICT it's had serverino enabled since day 1 with Audiobookshelf:

/mnt/media/podcasts                       //nas.local/Podcasts                 cifs        rw,relatime,vers=3.0,cache=strict,username=readhost,uid=1000,noforceuid,gid=1000,noforcegid,addr=nas.local,file_mode=0755,dir_mode=0755,soft,nounix,serverino,mapposix,rsize=4194304,wsize=4194304,bsize=1048576,echo_interval=60,actimeo=1,closetimeo=1

Figured it couldn't hurt to look at the recorded inodes of the latest corrupted episode, and both they and the filenames match, for whatever that's worth:
[screenshot: recorded inodes and filenames matching the files on disk]
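
For completeness, this is roughly how I pulled the inodes on the filesystem side to compare against what the database recorded; just a quick sketch, and the path is specific to my mount:

# inode_check.py -- dump inode and size for the episode files on the CIFS
# mount so they can be compared with what the database recorded. The path is
# specific to my setup; adjust for yours.
import os

PODCAST_DIR = "/mnt/media/podcasts/shared/A More Civilized Age -  A Star Wars Podcast"

for name in sorted(os.listdir(PODCAST_DIR)):
    if name.endswith(".mp3"):
        st = os.stat(os.path.join(PODCAST_DIR, name))
        print(f"{st.st_ino}\t{st.st_size}\t{name}")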

@BlackHoleFox
Author

I return again, hello. I've concluded the test I mentioned a few weeks ago, and my test library now seems to support the race condition theory I wrote out before. A new episode of my sample podcast came out a few days ago, so I went to check how both of my libraries handled it. The results came out as hypothesized :)

You can see below that the real library ended up with yet another duplicate for this episode, but the test library (which is also stored on my NAS via SMB) did not duplicate anything at all:

[screenshot: main library showing the duplicated episode]
[screenshot: test library with no duplicate]

Again, the only difference between the two is the time of day they check for new episodes in the podcast's configuration. This seems like decently strong evidence for the race condition theory, and maybe something you could configure locally yourself to reproduce it with 100% certainty.

I also made sure some other podcasts had their automatic download timer staggered relative to the library-wide scan, and I have not seen any duplicates from them either.
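
And in case anyone else wants to keep an eye on this while waiting for a fix, this is the quick duplicate check I've been running against a copy of the database (again, the podcastId/title column names are my assumption about the schema):

# dupe_check.py -- count episodes per (podcast, title) in a *copy* of the
# database; anything with a count > 1 is a duplicate worth looking at.
# Column names are assumptions about the schema.
import sqlite3

conn = sqlite3.connect("file:absdatabase.sqlite?mode=ro", uri=True)
for podcast_id, title, count in conn.execute(
    """
    SELECT podcastId, title, COUNT(*) AS n
    FROM podcastEpisodes
    GROUP BY podcastId, title
    HAVING n > 1
    """
):
    print(f"{count}x  {title}  (podcast {podcast_id})")
conn.close()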

@advplyr advplyr added the awaiting release Issue is resolved and will be in the next release label Nov 7, 2024
@advplyr
Owner

advplyr commented Nov 7, 2024

I was able to fix the issue where the scanner runs while an episode is being downloaded, causing a duplicate.

@BlackHoleFox
Author

Hurray, thanks 🎉. Will give it a spin once a new release goes out.


Fixed in v2.17.0.

@github-actions github-actions bot removed the awaiting release Issue is resolved and will be in the next release label Nov 17, 2024