feat(scanner): implement selective folder scanning and file system watcher improvements (#4674)

* feat: Add selective folder scanning capability

Implement targeted scanning of specific library/folder pairs without
full recursion. This enables efficient rescanning of individual folders
when changes are detected, significantly reducing scan time for large
libraries.

Key changes:
- Add ScanTarget struct and ScanFolders API to Scanner interface
- Implement CLI flag --targets for specifying libraryID:folderPath pairs
- Add FolderRepository.GetByPaths() for batch folder info retrieval
- Create loadSpecificFolders() for non-recursive directory loading
- Scope GC operations to affected libraries only (with TODO for full impl)
- Add comprehensive tests for selective scanning behavior

The selective scan:
- Only processes specified folders (no subdirectory recursion)
- Maintains library isolation
- Runs full maintenance pipeline scoped to affected libraries
- Supports both full and quick scan modes

Examples:
  navidrome scan --targets "1:Music/Rock,1:Music/Jazz"
  navidrome scan --full --targets "2:Classical"
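The `libraryID:folderPath` target format above can be sketched as a small, self-contained Go program. This is an illustrative re-implementation of the parsing rules (split on the *first* colon, positive integer library ID), not the actual code, which lives in `model.ParseTargets`:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// ScanTarget mirrors the struct added by this PR: a library ID plus a
// path relative to the library root ("" means the entire library).
type ScanTarget struct {
	LibraryID  int
	FolderPath string
}

// parseTarget splits a single "libraryID:folderPath" pair on the FIRST
// colon, so folder paths containing colons (e.g. "C:/Music") still work.
func parseTarget(s string) (ScanTarget, error) {
	id, path, ok := strings.Cut(strings.TrimSpace(s), ":")
	if !ok {
		return ScanTarget{}, fmt.Errorf("invalid target format: %q (expected libraryID:folderPath)", s)
	}
	libID, err := strconv.Atoi(id)
	if err != nil || libID <= 0 {
		return ScanTarget{}, fmt.Errorf("invalid library ID: %q", id)
	}
	return ScanTarget{LibraryID: libID, FolderPath: path}, nil
}

func main() {
	// Same comma-separated format the --targets CLI flag accepts.
	for _, raw := range strings.Split("1:Music/Rock,2:Classical", ",") {
		t, err := parseTarget(raw)
		if err != nil {
			panic(err)
		}
		fmt.Printf("library=%d folder=%q\n", t.LibraryID, t.FolderPath)
	}
}
```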

* feat(folder): replace GetByPaths with GetFolderUpdateInfo for improved folder update retrieval

Signed-off-by: Deluan <deluan@navidrome.org>

* test: update parseTargets test to handle folder names with spaces

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(folder): remove unused LibraryPath struct and update GC logging message

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(folder): enhance external scanner to support target-specific scanning

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): simplify scanner methods

Signed-off-by: Deluan <deluan@navidrome.org>

* feat(watcher): implement folder scanning notifications with deduplication

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(watcher): add resolveFolderPath function for testability

Signed-off-by: Deluan <deluan@navidrome.org>

* feat(watcher): implement path ignoring based on .ndignore patterns

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): implement IgnoreChecker for managing .ndignore patterns

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(ignore_checker): rename scanner to lineScanner for clarity

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): enhance ScanTarget struct with String method for better target representation

Signed-off-by: Deluan <deluan@navidrome.org>

* fix(scanner): validate library ID to prevent negative values

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): simplify GC method by removing library ID parameter

Signed-off-by: Deluan <deluan@navidrome.org>

* feat(scanner): update folder scanning to include all descendants of specified folders

Signed-off-by: Deluan <deluan@navidrome.org>

* feat(subsonic): allow selective scan in the /startScan endpoint

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): update CallScan to handle specific library/folder pairs

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): streamline scanning logic by removing scanAll method

Signed-off-by: Deluan <deluan@navidrome.org>

* test: enhance mockScanner for thread safety and improve test reliability

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): move scanner.ScanTarget to model.ScanTarget

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor: move scanner types to model, implement MockScanner

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): update scanner interface and implementations to use model.Scanner

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(folder_repository): normalize target path handling by using filepath.Clean

Signed-off-by: Deluan <deluan@navidrome.org>

* test(folder_repository): add comprehensive tests for folder retrieval and child exclusion

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): simplify selective scan logic using slice.Filter

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): streamline phase folder and album creation by removing unnecessary library parameter

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): move initialization logic from phase_1 to the scanner itself

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(tests): rename selective scan test file to scanner_selective_test.go

Signed-off-by: Deluan <deluan@navidrome.org>

* feat(configuration): add DevSelectiveWatcher configuration option

Signed-off-by: Deluan <deluan@navidrome.org>

* feat(watcher): enhance .ndignore handling for folder deletions and file changes

Signed-off-by: Deluan <deluan@navidrome.org>

* docs(scanner): comments

Signed-off-by: Deluan <deluan@navidrome.org>

* refactor(scanner): enhance walkDirTree to support target folder scanning

Signed-off-by: Deluan <deluan@navidrome.org>

* fix(scanner, watcher): handle errors when pushing ignore patterns for folders

Signed-off-by: Deluan <deluan@navidrome.org>

* Update scanner/phase_1_folders.go

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* refactor(scanner): replace parseTargets function with direct call to scanner.ParseTargets

Signed-off-by: Deluan <deluan@navidrome.org>

* test(scanner): add tests for ScanBegin and ScanEnd functionality

Signed-off-by: Deluan <deluan@navidrome.org>

* fix(library): update PRAGMA optimize to check table sizes without ANALYZE

Signed-off-by: Deluan <deluan@navidrome.org>

* test(scanner): refactor tests

Signed-off-by: Deluan <deluan@navidrome.org>

* feat(ui): add selective scan options and update translations

Signed-off-by: Deluan <deluan@navidrome.org>

* feat(ui): add quick and full scan options for individual libraries

Signed-off-by: Deluan <deluan@navidrome.org>

* feat(ui): add Scan buttons to the LibraryList

Signed-off-by: Deluan <deluan@navidrome.org>

* feat(scan): update scanning parameters from 'path' to 'target' for selective scans.

* refactor(scan): move ParseTargets function to model package

* test(scan): suppress unused return value from SetUserLibraries in tests

* feat(gc): enhance garbage collection to support selective library purging

Signed-off-by: Deluan <deluan@navidrome.org>

* fix(scanner): prevent race condition when scanning deleted folders

When the watcher detects changes in a folder that gets deleted before
the scanner runs (due to the 10-second delay), the scanner was
prematurely removing these folders from the tracking map, preventing
them from being marked as missing.

The issue occurred because `newFolderEntry` was calling `popLastUpdate`
before verifying the folder actually exists on the filesystem.

Changes:
- Move fs.Stat check before newFolderEntry creation in loadDir to
  ensure deleted folders remain in lastUpdates for finalize() to handle
- Add early existence check in walkDirTree to skip non-existent target
  folders with a warning log
- Add unit test verifying non-existent folders aren't removed from
  lastUpdates prematurely
- Add integration test for deleted folder scenario with ScanFolders

Fixes the issue where deleting entire folders (e.g., /music/AC_DC)
wouldn't mark tracks as missing when using selective folder scanning.

* refactor(scan): streamline folder entry creation and update handling

Signed-off-by: Deluan <deluan@navidrome.org>

* feat(scan): add '@Recycle' (QNAP) to ignored directories list

Signed-off-by: Deluan <deluan@navidrome.org>

* fix(log): improve thread safety in logging level management

* test(scan): move unit tests for ParseTargets function

Signed-off-by: Deluan <deluan@navidrome.org>

* review

Signed-off-by: Deluan <deluan@navidrome.org>

---------

Signed-off-by: Deluan <deluan@navidrome.org>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: deluan <deluan.quintao@mechanical-orchard.com>
This commit is contained in:
Deluan Quintão 2025-11-14 22:15:43 -05:00 committed by GitHub
parent bca76069c3
commit 28d5299ffc
GPG Key ID: B5690EEEBB952194 (no known key found for this signature in database)
52 changed files with 3221 additions and 374 deletions


@@ -4,10 +4,12 @@ import (
"context"
"encoding/gob"
"os"
"strings"
"github.com/navidrome/navidrome/core"
"github.com/navidrome/navidrome/db"
"github.com/navidrome/navidrome/log"
"github.com/navidrome/navidrome/model"
"github.com/navidrome/navidrome/persistence"
"github.com/navidrome/navidrome/scanner"
"github.com/navidrome/navidrome/utils/pl"
@@ -17,11 +19,13 @@ import (
var (
fullScan bool
subprocess bool
targets string
)
func init() {
scanCmd.Flags().BoolVarP(&fullScan, "full", "f", false, "check all subfolders, ignoring timestamps")
scanCmd.Flags().BoolVarP(&subprocess, "subprocess", "", false, "run as subprocess (internal use)")
scanCmd.Flags().StringVarP(&targets, "targets", "t", "", "comma-separated list of libraryID:folderPath pairs (e.g., \"1:Music/Rock,1:Music/Jazz,2:Classical\")")
rootCmd.AddCommand(scanCmd)
}
@@ -68,7 +72,18 @@ func runScanner(ctx context.Context) {
ds := persistence.New(sqlDB)
pls := core.NewPlaylists(ds)
progress, err := scanner.CallScan(ctx, ds, pls, fullScan)
// Parse targets if provided
var scanTargets []model.ScanTarget
if targets != "" {
var err error
scanTargets, err = model.ParseTargets(strings.Split(targets, ","))
if err != nil {
log.Fatal(ctx, "Failed to parse targets", err)
}
log.Info(ctx, "Scanning specific folders", "numTargets", len(scanTargets))
}
progress, err := scanner.CallScan(ctx, ds, pls, fullScan, scanTargets)
if err != nil {
log.Fatal(ctx, "Failed to scan", err)
}


@@ -69,9 +69,9 @@ func CreateNativeAPIRouter(ctx context.Context) *nativeapi.Router {
artworkArtwork := artwork.NewArtwork(dataStore, fileCache, fFmpeg, provider)
cacheWarmer := artwork.NewCacheWarmer(artworkArtwork, fileCache)
broker := events.GetBroker()
scannerScanner := scanner.New(ctx, dataStore, cacheWarmer, broker, playlists, metricsMetrics)
watcher := scanner.GetWatcher(dataStore, scannerScanner)
library := core.NewLibrary(dataStore, scannerScanner, watcher, broker)
modelScanner := scanner.New(ctx, dataStore, cacheWarmer, broker, playlists, metricsMetrics)
watcher := scanner.GetWatcher(dataStore, modelScanner)
library := core.NewLibrary(dataStore, modelScanner, watcher, broker)
maintenance := core.NewMaintenance(dataStore)
router := nativeapi.New(dataStore, share, playlists, insights, library, maintenance)
return router
@@ -95,10 +95,10 @@ func CreateSubsonicAPIRouter(ctx context.Context) *subsonic.Router {
cacheWarmer := artwork.NewCacheWarmer(artworkArtwork, fileCache)
broker := events.GetBroker()
playlists := core.NewPlaylists(dataStore)
scannerScanner := scanner.New(ctx, dataStore, cacheWarmer, broker, playlists, metricsMetrics)
modelScanner := scanner.New(ctx, dataStore, cacheWarmer, broker, playlists, metricsMetrics)
playTracker := scrobbler.GetPlayTracker(dataStore, broker, manager)
playbackServer := playback.GetInstance(dataStore)
router := subsonic.New(dataStore, artworkArtwork, mediaStreamer, archiver, players, provider, scannerScanner, broker, playlists, playTracker, share, playbackServer, metricsMetrics)
router := subsonic.New(dataStore, artworkArtwork, mediaStreamer, archiver, players, provider, modelScanner, broker, playlists, playTracker, share, playbackServer, metricsMetrics)
return router
}
@@ -150,7 +150,7 @@ func CreatePrometheus() metrics.Metrics {
return metricsMetrics
}
func CreateScanner(ctx context.Context) scanner.Scanner {
func CreateScanner(ctx context.Context) model.Scanner {
sqlDB := db.Db()
dataStore := persistence.New(sqlDB)
fileCache := artwork.GetImageCache()
@@ -163,8 +163,8 @@ func CreateScanner(ctx context.Context) scanner.Scanner {
cacheWarmer := artwork.NewCacheWarmer(artworkArtwork, fileCache)
broker := events.GetBroker()
playlists := core.NewPlaylists(dataStore)
scannerScanner := scanner.New(ctx, dataStore, cacheWarmer, broker, playlists, metricsMetrics)
return scannerScanner
modelScanner := scanner.New(ctx, dataStore, cacheWarmer, broker, playlists, metricsMetrics)
return modelScanner
}
func CreateScanWatcher(ctx context.Context) scanner.Watcher {
@@ -180,8 +180,8 @@ func CreateScanWatcher(ctx context.Context) scanner.Watcher {
cacheWarmer := artwork.NewCacheWarmer(artworkArtwork, fileCache)
broker := events.GetBroker()
playlists := core.NewPlaylists(dataStore)
scannerScanner := scanner.New(ctx, dataStore, cacheWarmer, broker, playlists, metricsMetrics)
watcher := scanner.GetWatcher(dataStore, scannerScanner)
modelScanner := scanner.New(ctx, dataStore, cacheWarmer, broker, playlists, metricsMetrics)
watcher := scanner.GetWatcher(dataStore, modelScanner)
return watcher
}
@@ -202,7 +202,7 @@ func getPluginManager() plugins.Manager {
// wire_injectors.go:
var allProviders = wire.NewSet(core.Set, artwork.Set, server.New, subsonic.New, nativeapi.New, public.New, persistence.New, lastfm.NewRouter, listenbrainz.NewRouter, events.GetBroker, scanner.New, scanner.GetWatcher, plugins.GetManager, metrics.GetPrometheusInstance, db.Db, wire.Bind(new(agents.PluginLoader), new(plugins.Manager)), wire.Bind(new(scrobbler.PluginLoader), new(plugins.Manager)), wire.Bind(new(metrics.PluginLoader), new(plugins.Manager)), wire.Bind(new(core.Scanner), new(scanner.Scanner)), wire.Bind(new(core.Watcher), new(scanner.Watcher)))
var allProviders = wire.NewSet(core.Set, artwork.Set, server.New, subsonic.New, nativeapi.New, public.New, persistence.New, lastfm.NewRouter, listenbrainz.NewRouter, events.GetBroker, scanner.New, scanner.GetWatcher, plugins.GetManager, metrics.GetPrometheusInstance, db.Db, wire.Bind(new(agents.PluginLoader), new(plugins.Manager)), wire.Bind(new(scrobbler.PluginLoader), new(plugins.Manager)), wire.Bind(new(metrics.PluginLoader), new(plugins.Manager)), wire.Bind(new(core.Watcher), new(scanner.Watcher)))
func GetPluginManager(ctx context.Context) plugins.Manager {
manager := getPluginManager()


@@ -45,7 +45,6 @@ var allProviders = wire.NewSet(
wire.Bind(new(agents.PluginLoader), new(plugins.Manager)),
wire.Bind(new(scrobbler.PluginLoader), new(plugins.Manager)),
wire.Bind(new(metrics.PluginLoader), new(plugins.Manager)),
wire.Bind(new(core.Scanner), new(scanner.Scanner)),
wire.Bind(new(core.Watcher), new(scanner.Watcher)),
)
@@ -103,7 +102,7 @@ func CreatePrometheus() metrics.Metrics {
))
}
func CreateScanner(ctx context.Context) scanner.Scanner {
func CreateScanner(ctx context.Context) model.Scanner {
panic(wire.Build(
allProviders,
))


@@ -125,6 +125,7 @@ type configOptions struct {
DevAlbumInfoTimeToLive time.Duration
DevExternalScanner bool
DevScannerThreads uint
DevSelectiveWatcher bool
DevInsightsInitialDelay time.Duration
DevEnablePlayerInsights bool
DevEnablePluginsInsights bool
@@ -600,6 +601,7 @@ func setViperDefaults() {
viper.SetDefault("devalbuminfotimetolive", consts.AlbumInfoTimeToLive)
viper.SetDefault("devexternalscanner", true)
viper.SetDefault("devscannerthreads", 5)
viper.SetDefault("devselectivewatcher", true)
viper.SetDefault("devinsightsinitialdelay", consts.InsightsInitialDelay)
viper.SetDefault("devenableplayerinsights", true)
viper.SetDefault("devenablepluginsinsights", true)


@@ -21,11 +21,6 @@ import (
"github.com/navidrome/navidrome/utils/slice"
)
// Scanner interface for triggering scans
type Scanner interface {
ScanAll(ctx context.Context, fullScan bool) (warnings []string, err error)
}
// Watcher interface for managing file system watchers
type Watcher interface {
Watch(ctx context.Context, lib *model.Library) error
@@ -43,13 +38,13 @@ type Library interface {
type libraryService struct {
ds model.DataStore
scanner Scanner
scanner model.Scanner
watcher Watcher
broker events.Broker
}
// NewLibrary creates a new Library service
func NewLibrary(ds model.DataStore, scanner Scanner, watcher Watcher, broker events.Broker) Library {
func NewLibrary(ds model.DataStore, scanner model.Scanner, watcher Watcher, broker events.Broker) Library {
return &libraryService{
ds: ds,
scanner: scanner,
@@ -155,7 +150,7 @@ type libraryRepositoryWrapper struct {
model.LibraryRepository
ctx context.Context
ds model.DataStore
scanner Scanner
scanner model.Scanner
watcher Watcher
broker events.Broker
}
@@ -192,7 +187,7 @@ func (r *libraryRepositoryWrapper) Save(entity interface{}) (string, error) {
return strconv.Itoa(lib.ID), nil
}
func (r *libraryRepositoryWrapper) Update(id string, entity interface{}, cols ...string) error {
func (r *libraryRepositoryWrapper) Update(id string, entity interface{}, _ ...string) error {
lib := entity.(*model.Library)
libID, err := strconv.Atoi(id)
if err != nil {


@@ -29,7 +29,7 @@ var _ = Describe("Library Service", func() {
var userRepo *tests.MockedUserRepo
var ctx context.Context
var tempDir string
var scanner *mockScanner
var scanner *tests.MockScanner
var watcherManager *mockWatcherManager
var broker *mockEventBroker
@@ -43,7 +43,7 @@ var _ = Describe("Library Service", func() {
ds.MockedUser = userRepo
// Create a mock scanner that tracks calls
scanner = &mockScanner{}
scanner = tests.NewMockScanner()
// Create a mock watcher manager
watcherManager = &mockWatcherManager{
libraryStates: make(map[int]model.Library),
@@ -616,11 +616,12 @@ var _ = Describe("Library Service", func() {
// Wait briefly for the goroutine to complete
Eventually(func() int {
return scanner.len()
return scanner.GetScanAllCallCount()
}, "1s", "10ms").Should(Equal(1))
// Verify scan was called with correct parameters
Expect(scanner.ScanCalls[0].FullScan).To(BeFalse()) // Should be quick scan
calls := scanner.GetScanAllCalls()
Expect(calls[0].FullScan).To(BeFalse()) // Should be quick scan
})
It("triggers scan when updating library path", func() {
@@ -641,11 +642,12 @@ var _ = Describe("Library Service", func() {
// Wait briefly for the goroutine to complete
Eventually(func() int {
return scanner.len()
return scanner.GetScanAllCallCount()
}, "1s", "10ms").Should(Equal(1))
// Verify scan was called with correct parameters
Expect(scanner.ScanCalls[0].FullScan).To(BeFalse()) // Should be quick scan
calls := scanner.GetScanAllCalls()
Expect(calls[0].FullScan).To(BeFalse()) // Should be quick scan
})
It("does not trigger scan when updating library without path change", func() {
@@ -661,7 +663,7 @@ var _ = Describe("Library Service", func() {
// Wait a bit to ensure no scan was triggered
Consistently(func() int {
return scanner.len()
return scanner.GetScanAllCallCount()
}, "100ms", "10ms").Should(Equal(0))
})
@@ -674,7 +676,7 @@ var _ = Describe("Library Service", func() {
// Ensure no scan was triggered since creation failed
Consistently(func() int {
return scanner.len()
return scanner.GetScanAllCallCount()
}, "100ms", "10ms").Should(Equal(0))
})
@@ -691,7 +693,7 @@ var _ = Describe("Library Service", func() {
// Ensure no scan was triggered since update failed
Consistently(func() int {
return scanner.len()
return scanner.GetScanAllCallCount()
}, "100ms", "10ms").Should(Equal(0))
})
@@ -707,11 +709,12 @@ var _ = Describe("Library Service", func() {
// Wait briefly for the goroutine to complete
Eventually(func() int {
return scanner.len()
return scanner.GetScanAllCallCount()
}, "1s", "10ms").Should(Equal(1))
// Verify scan was called with correct parameters
Expect(scanner.ScanCalls[0].FullScan).To(BeFalse()) // Should be quick scan
calls := scanner.GetScanAllCalls()
Expect(calls[0].FullScan).To(BeFalse()) // Should be quick scan
})
It("does not trigger scan when library deletion fails", func() {
@ -721,7 +724,7 @@ var _ = Describe("Library Service", func() {
// Ensure no scan was triggered since deletion failed
Consistently(func() int {
return scanner.len()
return scanner.GetScanAllCallCount()
}, "100ms", "10ms").Should(Equal(0))
})
@@ -868,31 +871,6 @@ var _ = Describe("Library Service", func() {
})
})
// mockScanner provides a simple mock implementation of core.Scanner for testing
type mockScanner struct {
ScanCalls []ScanCall
mu sync.RWMutex
}
type ScanCall struct {
FullScan bool
}
func (m *mockScanner) ScanAll(ctx context.Context, fullScan bool) (warnings []string, err error) {
m.mu.Lock()
defer m.mu.Unlock()
m.ScanCalls = append(m.ScanCalls, ScanCall{
FullScan: fullScan,
})
return []string{}, nil
}
func (m *mockScanner) len() int {
m.mu.RLock()
defer m.mu.RUnlock()
return len(m.ScanCalls)
}
// mockWatcherManager provides a simple mock implementation of core.Watcher for testing
type mockWatcherManager struct {
StartedWatchers []model.Library


@@ -14,7 +14,7 @@ import (
)
var _ = Describe("Maintenance", func() {
var ds *extendedDataStore
var ds *tests.MockDataStore
var mfRepo *extendedMediaFileRepo
var service Maintenance
var ctx context.Context
@@ -42,7 +42,7 @@ var _ = Describe("Maintenance", func() {
Expect(err).ToNot(HaveOccurred())
Expect(mfRepo.deleteMissingCalled).To(BeTrue())
Expect(mfRepo.deletedIDs).To(Equal([]string{"mf1", "mf2"}))
Expect(ds.gcCalled).To(BeTrue(), "GC should be called after deletion")
Expect(ds.GCCalled).To(BeTrue(), "GC should be called after deletion")
})
It("triggers artist stats refresh and album refresh after deletion", func() {
@@ -97,7 +97,7 @@ })
})
// Set GC to return error
ds.gcError = errors.New("gc failed")
ds.GCError = errors.New("gc failed")
err := service.DeleteMissingFiles(ctx, []string{"mf1"})
@@ -143,7 +143,7 @@ err := service.DeleteAllMissingFiles(ctx)
err := service.DeleteAllMissingFiles(ctx)
Expect(err).ToNot(HaveOccurred())
Expect(ds.gcCalled).To(BeTrue(), "GC should be called after deletion")
Expect(ds.GCCalled).To(BeTrue(), "GC should be called after deletion")
})
It("returns error if deletion fails", func() {
@@ -253,11 +253,8 @@ })
})
// Test helper to create a mock DataStore with controllable behavior
func createTestDataStore() *extendedDataStore {
// Create extended datastore with GC tracking
ds := &extendedDataStore{
MockDataStore: &tests.MockDataStore{},
}
func createTestDataStore() *tests.MockDataStore {
ds := &tests.MockDataStore{}
// Create extended album repo with Put tracking
albumRepo := &extendedAlbumRepo{
@@ -365,18 +362,3 @@ func (m *extendedArtistRepo) IsRefreshStatsCalled() bool {
defer m.mu.RUnlock()
return m.refreshStatsCalled
}
// Extension of MockDataStore to track GC calls
type extendedDataStore struct {
*tests.MockDataStore
gcCalled bool
gcError error
}
func (ds *extendedDataStore) GC(ctx context.Context) error {
ds.gcCalled = true
if ds.gcError != nil {
return ds.gcError
}
return ds.MockDataStore.GC(ctx)
}


@@ -80,8 +80,8 @@ var (
// SetLevel sets the global log level used by the simple logger.
func SetLevel(l Level) {
currentLevel = l
loggerMu.Lock()
currentLevel = l
defaultLogger.Level = logrus.TraceLevel
loggerMu.Unlock()
logrus.SetLevel(logrus.Level(l))
@@ -114,6 +114,8 @@ func levelFromString(l string) Level {
// SetLogLevels sets the log levels for specific paths in the codebase.
func SetLogLevels(levels map[string]string) {
loggerMu.Lock()
defer loggerMu.Unlock()
logLevels = nil
for k, v := range levels {
logLevels = append(logLevels, levelPath{path: k, level: levelFromString(v)})
@@ -172,6 +174,8 @@ func SetDefaultLogger(l *logrus.Logger) {
}
func CurrentLevel() Level {
loggerMu.RLock()
defer loggerMu.RUnlock()
return currentLevel
}
@@ -220,10 +224,15 @@ func Writer() io.Writer {
}
func shouldLog(requiredLevel Level, skip int) bool {
if currentLevel >= requiredLevel {
loggerMu.RLock()
level := currentLevel
levels := logLevels
loggerMu.RUnlock()
if level >= requiredLevel {
return true
}
if len(logLevels) == 0 {
if len(levels) == 0 {
return false
}
@@ -233,7 +242,7 @@ func shouldLog(requiredLevel Level, skip int) bool {
}
file = strings.TrimPrefix(file, rootPath)
for _, lp := range logLevels {
for _, lp := range levels {
if strings.HasPrefix(file, lp.path) {
return lp.level >= requiredLevel
}


@@ -43,5 +43,5 @@ type DataStore interface {
WithTx(block func(tx DataStore) error, scope ...string) error
WithTxImmediate(block func(tx DataStore) error, scope ...string) error
GC(ctx context.Context) error
GC(ctx context.Context, libraryIDs ...int) error
}


@@ -85,7 +85,7 @@ type FolderRepository interface {
GetByPath(lib Library, path string) (*Folder, error)
GetAll(...QueryOptions) ([]Folder, error)
CountAll(...QueryOptions) (int64, error)
GetLastUpdates(lib Library) (map[string]FolderUpdateInfo, error)
GetFolderUpdateInfo(lib Library, targetPaths ...string) (map[string]FolderUpdateInfo, error)
Put(*Folder) error
MarkMissing(missing bool, ids ...string) error
GetTouchedWithPlaylists() (FolderCursor, error)

model/scanner.go (new file, 81 lines)

@@ -0,0 +1,81 @@
package model
import (
"context"
"fmt"
"strconv"
"strings"
"time"
)
// ScanTarget represents a specific folder within a library to be scanned.
// NOTE: This struct is used as a map key, so it should only contain comparable types.
type ScanTarget struct {
LibraryID int
FolderPath string // Relative path within the library, or "" for entire library
}
func (st ScanTarget) String() string {
return fmt.Sprintf("%d:%s", st.LibraryID, st.FolderPath)
}
// ScannerStatus holds information about the current scan status
type ScannerStatus struct {
Scanning bool
LastScan time.Time
Count uint32
FolderCount uint32
LastError string
ScanType string
ElapsedTime time.Duration
}
type Scanner interface {
// ScanAll starts a scan of all libraries. This is a blocking operation.
ScanAll(ctx context.Context, fullScan bool) (warnings []string, err error)
// ScanFolders scans specific library/folder pairs, recursing into subdirectories.
// If targets is nil, it scans all libraries. This is a blocking operation.
ScanFolders(ctx context.Context, fullScan bool, targets []ScanTarget) (warnings []string, err error)
Status(context.Context) (*ScannerStatus, error)
}
// ParseTargets parses scan targets strings into ScanTarget structs.
// Example: []string{"1:Music/Rock", "2:Classical"}
func ParseTargets(libFolders []string) ([]ScanTarget, error) {
targets := make([]ScanTarget, 0, len(libFolders))
for _, part := range libFolders {
part = strings.TrimSpace(part)
if part == "" {
continue
}
// Split by the first colon
colonIdx := strings.Index(part, ":")
if colonIdx == -1 {
return nil, fmt.Errorf("invalid target format: %q (expected libraryID:folderPath)", part)
}
libIDStr := part[:colonIdx]
folderPath := part[colonIdx+1:]
libID, err := strconv.Atoi(libIDStr)
if err != nil {
return nil, fmt.Errorf("invalid library ID %q: %w", libIDStr, err)
}
if libID <= 0 {
return nil, fmt.Errorf("invalid library ID %q", libIDStr)
}
targets = append(targets, ScanTarget{
LibraryID: libID,
FolderPath: folderPath,
})
}
if len(targets) == 0 {
return nil, fmt.Errorf("no valid targets found")
}
return targets, nil
}

model/scanner_test.go (new file, 89 lines)

@@ -0,0 +1,89 @@
package model_test
import (
"github.com/navidrome/navidrome/model"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
var _ = Describe("ParseTargets", func() {
It("parses multiple entries in slice", func() {
targets, err := model.ParseTargets([]string{"1:Music/Rock", "1:Music/Jazz", "2:Classical"})
Expect(err).ToNot(HaveOccurred())
Expect(targets).To(HaveLen(3))
Expect(targets[0].LibraryID).To(Equal(1))
Expect(targets[0].FolderPath).To(Equal("Music/Rock"))
Expect(targets[1].LibraryID).To(Equal(1))
Expect(targets[1].FolderPath).To(Equal("Music/Jazz"))
Expect(targets[2].LibraryID).To(Equal(2))
Expect(targets[2].FolderPath).To(Equal("Classical"))
})
It("handles empty folder paths", func() {
targets, err := model.ParseTargets([]string{"1:", "2:"})
Expect(err).ToNot(HaveOccurred())
Expect(targets).To(HaveLen(2))
Expect(targets[0].FolderPath).To(Equal(""))
Expect(targets[1].FolderPath).To(Equal(""))
})
It("trims whitespace from entries", func() {
targets, err := model.ParseTargets([]string{" 1:Music/Rock", " 2:Classical "})
Expect(err).ToNot(HaveOccurred())
Expect(targets).To(HaveLen(2))
Expect(targets[0].LibraryID).To(Equal(1))
Expect(targets[0].FolderPath).To(Equal("Music/Rock"))
Expect(targets[1].LibraryID).To(Equal(2))
Expect(targets[1].FolderPath).To(Equal("Classical"))
})
It("skips empty strings", func() {
targets, err := model.ParseTargets([]string{"1:Music/Rock", "", "2:Classical"})
Expect(err).ToNot(HaveOccurred())
Expect(targets).To(HaveLen(2))
})
It("handles paths with colons", func() {
targets, err := model.ParseTargets([]string{"1:C:/Music/Rock", "2:/path:with:colons"})
Expect(err).ToNot(HaveOccurred())
Expect(targets).To(HaveLen(2))
Expect(targets[0].FolderPath).To(Equal("C:/Music/Rock"))
Expect(targets[1].FolderPath).To(Equal("/path:with:colons"))
})
It("returns error for invalid format without colon", func() {
_, err := model.ParseTargets([]string{"1Music/Rock"})
Expect(err).To(HaveOccurred())
Expect(err.Error()).To(ContainSubstring("invalid target format"))
})
It("returns error for non-numeric library ID", func() {
_, err := model.ParseTargets([]string{"abc:Music/Rock"})
Expect(err).To(HaveOccurred())
Expect(err.Error()).To(ContainSubstring("invalid library ID"))
})
It("returns error for negative library ID", func() {
_, err := model.ParseTargets([]string{"-1:Music/Rock"})
Expect(err).To(HaveOccurred())
Expect(err.Error()).To(ContainSubstring("invalid library ID"))
})
It("returns error for zero library ID", func() {
_, err := model.ParseTargets([]string{"0:Music/Rock"})
Expect(err).To(HaveOccurred())
Expect(err.Error()).To(ContainSubstring("invalid library ID"))
})
It("returns error for empty input", func() {
_, err := model.ParseTargets([]string{})
Expect(err).To(HaveOccurred())
Expect(err.Error()).To(ContainSubstring("no valid targets found"))
})
It("returns error for all empty strings", func() {
_, err := model.ParseTargets([]string{"", " ", ""})
Expect(err).To(HaveOccurred())
Expect(err.Error()).To(ContainSubstring("no valid targets found"))
})
})


@@ -337,8 +337,12 @@ on conflict (user_id, item_id, item_type) do update
return r.executeSQL(query)
}
func (r *albumRepository) purgeEmpty() error {
func (r *albumRepository) purgeEmpty(libraryIDs ...int) error {
del := Delete(r.tableName).Where("id not in (select distinct(album_id) from media_file)")
// If libraryIDs are specified, only purge albums from those libraries
if len(libraryIDs) > 0 {
del = del.Where(Eq{"library_id": libraryIDs})
}
c, err := r.executeSQL(del)
if err != nil {
return fmt.Errorf("purging empty albums: %w", err)


@@ -4,7 +4,10 @@ import (
"context"
"encoding/json"
"fmt"
"os"
"path/filepath"
"slices"
"strings"
"time"
. "github.com/Masterminds/squirrel"
@@ -91,8 +94,47 @@ func (r folderRepository) CountAll(opt ...model.QueryOptions) (int64, error) {
return r.count(query)
}
func (r folderRepository) GetLastUpdates(lib model.Library) (map[string]model.FolderUpdateInfo, error) {
sq := r.newSelect().Columns("id", "updated_at", "hash").Where(Eq{"library_id": lib.ID, "missing": false})
func (r folderRepository) GetFolderUpdateInfo(lib model.Library, targetPaths ...string) (map[string]model.FolderUpdateInfo, error) {
where := And{
Eq{"library_id": lib.ID},
Eq{"missing": false},
}
// If specific paths are requested, include those folders and all their descendants
if len(targetPaths) > 0 {
// Collect folder IDs for exact target folders and path conditions for descendants
folderIDs := make([]string, 0, len(targetPaths))
pathConditions := make(Or, 0, len(targetPaths)*2)
for _, targetPath := range targetPaths {
if targetPath == "" || targetPath == "." {
// Root path - include everything in this library
pathConditions = Or{}
folderIDs = nil
break
}
// Clean the path to normalize it. Paths stored in the folder table do not have leading/trailing slashes.
cleanPath := strings.TrimPrefix(targetPath, string(os.PathSeparator))
cleanPath = filepath.Clean(cleanPath)
// Include the target folder itself by ID
folderIDs = append(folderIDs, model.FolderID(lib, cleanPath))
// Include all descendants: folders whose path field equals or starts with the target path
// Note: Folder.Path is the directory path, so children have path = targetPath
pathConditions = append(pathConditions, Eq{"path": cleanPath})
pathConditions = append(pathConditions, Like{"path": cleanPath + "/%"})
}
// Combine conditions: exact folder IDs OR descendant path patterns
if len(folderIDs) > 0 {
where = append(where, Or{Eq{"id": folderIDs}, pathConditions})
} else if len(pathConditions) > 0 {
where = append(where, pathConditions)
}
}
sq := r.newSelect().Columns("id", "updated_at", "hash").Where(where)
var res []struct {
ID string
UpdatedAt time.Time
@ -149,7 +191,7 @@ func (r folderRepository) GetTouchedWithPlaylists() (model.FolderCursor, error)
}, nil
}
func (r folderRepository) purgeEmpty() error {
func (r folderRepository) purgeEmpty(libraryIDs ...int) error {
sq := Delete(r.tableName).Where(And{
Eq{"num_audio_files": 0},
Eq{"num_playlists": 0},
@ -157,6 +199,10 @@ func (r folderRepository) purgeEmpty() error {
ConcatExpr("id not in (select parent_id from folder)"),
ConcatExpr("id not in (select folder_id from media_file)"),
})
// If libraryIDs are specified, only purge folders from those libraries
if len(libraryIDs) > 0 {
sq = sq.Where(Eq{"library_id": libraryIDs})
}
c, err := r.executeSQL(sq)
if err != nil {
return fmt.Errorf("purging empty folders: %w", err)
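The path normalization and descendant matching in `GetFolderUpdateInfo` above can be sketched in isolation with the stdlib; `matchesTarget` mirrors the `Eq`/`Like` pair (exact path or `path + "/%"` prefix), simplified to compare folder paths directly:

```go
package main

import (
	"fmt"
	"path"
	"strings"
)

// normalizeTarget mirrors the cleanup applied to target paths above:
// strip a leading separator and clean the path, since folder rows are
// stored without leading/trailing slashes.
func normalizeTarget(p string) string {
	p = strings.TrimPrefix(p, "/")
	return path.Clean(p)
}

// matchesTarget reports whether folderPath is the target itself or a
// descendant of it, mirroring the Eq/Like pair in the SQL above. The
// "/" suffix prevents sibling prefixes from matching.
func matchesTarget(target, folderPath string) bool {
	return folderPath == target || strings.HasPrefix(folderPath, target+"/")
}

func main() {
	t := normalizeTarget("/Music/Rock/")
	fmt.Println(t)                                    // Music/Rock
	fmt.Println(matchesTarget(t, "Music/Rock"))       // true  (the folder itself)
	fmt.Println(matchesTarget(t, "Music/Rock/Queen")) // true  (descendant)
	fmt.Println(matchesTarget(t, "Music/Rockabilly")) // false (sibling with same prefix)
}
```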


@ -0,0 +1,213 @@
package persistence
import (
"context"
"fmt"
"github.com/navidrome/navidrome/log"
"github.com/navidrome/navidrome/model"
"github.com/navidrome/navidrome/model/request"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
"github.com/pocketbase/dbx"
)
var _ = Describe("FolderRepository", func() {
var repo model.FolderRepository
var ctx context.Context
var conn *dbx.DB
var testLib, otherLib model.Library
BeforeEach(func() {
ctx = request.WithUser(log.NewContext(context.TODO()), model.User{ID: "userid"})
conn = GetDBXBuilder()
repo = newFolderRepository(ctx, conn)
// Use existing library ID 1 from test fixtures
libRepo := NewLibraryRepository(ctx, conn)
lib, err := libRepo.Get(1)
Expect(err).ToNot(HaveOccurred())
testLib = *lib
// Create a second library with its own folder to verify isolation
otherLib = model.Library{Name: "Other Library", Path: "/other/path"}
Expect(libRepo.Put(&otherLib)).To(Succeed())
})
AfterEach(func() {
// Clean up only test folders created by our tests (paths starting with "Test")
// This prevents interference with fixture data needed by other tests
_, _ = conn.NewQuery("DELETE FROM folder WHERE library_id = 1 AND path LIKE 'Test%'").Execute()
_, _ = conn.NewQuery(fmt.Sprintf("DELETE FROM library WHERE id = %d", otherLib.ID)).Execute()
})
Describe("GetFolderUpdateInfo", func() {
Context("with no target paths", func() {
It("returns all folders in the library", func() {
// Create test folders with unique names to avoid conflicts
folder1 := model.NewFolder(testLib, "TestGetLastUpdates/Folder1")
folder2 := model.NewFolder(testLib, "TestGetLastUpdates/Folder2")
err := repo.Put(folder1)
Expect(err).ToNot(HaveOccurred())
err = repo.Put(folder2)
Expect(err).ToNot(HaveOccurred())
otherFolder := model.NewFolder(otherLib, "TestOtherLib/Folder")
err = repo.Put(otherFolder)
Expect(err).ToNot(HaveOccurred())
// Query all folders (no target paths) - should only return folders from testLib
results, err := repo.GetFolderUpdateInfo(testLib)
Expect(err).ToNot(HaveOccurred())
// Should include folders from testLib
Expect(results).To(HaveKey(folder1.ID))
Expect(results).To(HaveKey(folder2.ID))
// Should NOT include folders from other library
Expect(results).ToNot(HaveKey(otherFolder.ID))
})
})
Context("with specific target paths", func() {
It("returns folder info for existing folders", func() {
// Create test folders with unique names
folder1 := model.NewFolder(testLib, "TestSpecific/Rock")
folder2 := model.NewFolder(testLib, "TestSpecific/Jazz")
folder3 := model.NewFolder(testLib, "TestSpecific/Classical")
err := repo.Put(folder1)
Expect(err).ToNot(HaveOccurred())
err = repo.Put(folder2)
Expect(err).ToNot(HaveOccurred())
err = repo.Put(folder3)
Expect(err).ToNot(HaveOccurred())
// Query specific paths
results, err := repo.GetFolderUpdateInfo(testLib, "TestSpecific/Rock", "TestSpecific/Classical")
Expect(err).ToNot(HaveOccurred())
Expect(results).To(HaveLen(2))
// Verify folder IDs are in results
Expect(results).To(HaveKey(folder1.ID))
Expect(results).To(HaveKey(folder3.ID))
Expect(results).ToNot(HaveKey(folder2.ID))
// Verify update info is populated
Expect(results[folder1.ID].UpdatedAt).ToNot(BeZero())
Expect(results[folder1.ID].Hash).To(Equal(folder1.Hash))
})
It("includes all child folders when querying parent", func() {
// Create a parent folder with multiple children
parent := model.NewFolder(testLib, "TestParent/Music")
child1 := model.NewFolder(testLib, "TestParent/Music/Rock/Queen")
child2 := model.NewFolder(testLib, "TestParent/Music/Jazz")
otherParent := model.NewFolder(testLib, "TestParent2/Music/Jazz")
Expect(repo.Put(parent)).To(Succeed())
Expect(repo.Put(child1)).To(Succeed())
Expect(repo.Put(child2)).To(Succeed())
Expect(repo.Put(otherParent)).To(Succeed())
// Query the parent folder - should return parent and all children
results, err := repo.GetFolderUpdateInfo(testLib, "TestParent/Music")
Expect(err).ToNot(HaveOccurred())
Expect(results).To(HaveLen(3))
Expect(results).To(HaveKey(parent.ID))
Expect(results).To(HaveKey(child1.ID))
Expect(results).To(HaveKey(child2.ID))
Expect(results).ToNot(HaveKey(otherParent.ID))
})
It("excludes children from other libraries", func() {
// Create parent in testLib
parent := model.NewFolder(testLib, "TestIsolation/Parent")
child := model.NewFolder(testLib, "TestIsolation/Parent/Child")
Expect(repo.Put(parent)).To(Succeed())
Expect(repo.Put(child)).To(Succeed())
// Create similar path in other library
otherParent := model.NewFolder(otherLib, "TestIsolation/Parent")
otherChild := model.NewFolder(otherLib, "TestIsolation/Parent/Child")
Expect(repo.Put(otherParent)).To(Succeed())
Expect(repo.Put(otherChild)).To(Succeed())
// Query should only return folders from testLib
results, err := repo.GetFolderUpdateInfo(testLib, "TestIsolation/Parent")
Expect(err).ToNot(HaveOccurred())
Expect(results).To(HaveLen(2))
Expect(results).To(HaveKey(parent.ID))
Expect(results).To(HaveKey(child.ID))
Expect(results).ToNot(HaveKey(otherParent.ID))
Expect(results).ToNot(HaveKey(otherChild.ID))
})
It("excludes missing children when querying parent", func() {
// Create parent and children, mark one as missing
parent := model.NewFolder(testLib, "TestMissingChild/Parent")
child1 := model.NewFolder(testLib, "TestMissingChild/Parent/Child1")
child2 := model.NewFolder(testLib, "TestMissingChild/Parent/Child2")
child2.Missing = true
Expect(repo.Put(parent)).To(Succeed())
Expect(repo.Put(child1)).To(Succeed())
Expect(repo.Put(child2)).To(Succeed())
// Query parent - should only return parent and non-missing child
results, err := repo.GetFolderUpdateInfo(testLib, "TestMissingChild/Parent")
Expect(err).ToNot(HaveOccurred())
Expect(results).To(HaveLen(2))
Expect(results).To(HaveKey(parent.ID))
Expect(results).To(HaveKey(child1.ID))
Expect(results).ToNot(HaveKey(child2.ID))
})
It("handles mix of existing and non-existing target paths", func() {
// Create folders for one path but not the other
existingParent := model.NewFolder(testLib, "TestMixed/Exists")
existingChild := model.NewFolder(testLib, "TestMixed/Exists/Child")
Expect(repo.Put(existingParent)).To(Succeed())
Expect(repo.Put(existingChild)).To(Succeed())
// Query both existing and non-existing paths
results, err := repo.GetFolderUpdateInfo(testLib, "TestMixed/Exists", "TestMixed/DoesNotExist")
Expect(err).ToNot(HaveOccurred())
Expect(results).To(HaveLen(2))
Expect(results).To(HaveKey(existingParent.ID))
Expect(results).To(HaveKey(existingChild.ID))
})
It("handles empty folder path as root", func() {
// Test querying for root folder without creating it (fixtures should have one)
rootFolderID := model.FolderID(testLib, ".")
results, err := repo.GetFolderUpdateInfo(testLib, "")
Expect(err).ToNot(HaveOccurred())
// Should return the root folder if it exists
if len(results) > 0 {
Expect(results).To(HaveKey(rootFolderID))
}
})
It("returns empty map for non-existent folders", func() {
results, err := repo.GetFolderUpdateInfo(testLib, "NonExistent/Path")
Expect(err).ToNot(HaveOccurred())
Expect(results).To(BeEmpty())
})
It("skips missing folders", func() {
// Create a folder and mark it as missing
folder := model.NewFolder(testLib, "TestMissing/Folder")
folder.Missing = true
err := repo.Put(folder)
Expect(err).ToNot(HaveOccurred())
results, err := repo.GetFolderUpdateInfo(testLib, "TestMissing/Folder")
Expect(err).ToNot(HaveOccurred())
Expect(results).To(BeEmpty())
})
})
})
})


@ -177,7 +177,9 @@ func (r *libraryRepository) ScanEnd(id int) error {
return err
}
// https://www.sqlite.org/pragma.html#pragma_optimize
_, err = r.executeSQL(Expr("PRAGMA optimize=0x10012;"))
// Use mask 0x10000 to check table sizes without running ANALYZE
// Running ANALYZE can cause query planner issues with expression-based collation indexes
_, err = r.executeSQL(Expr("PRAGMA optimize=0x10000;"))
return err
}


@ -142,4 +142,62 @@ var _ = Describe("LibraryRepository", func() {
Expect(libAfter.TotalSize).To(Equal(sizeRes.Sum))
Expect(libAfter.TotalDuration).To(Equal(durationRes.Sum))
})
Describe("ScanBegin and ScanEnd", func() {
var lib *model.Library
BeforeEach(func() {
lib = &model.Library{
ID: 0,
Name: "Test Scan Library",
Path: "/music/test-scan",
}
err := repo.Put(lib)
Expect(err).ToNot(HaveOccurred())
})
DescribeTable("ScanBegin",
func(fullScan bool, expectedFullScanInProgress bool) {
err := repo.ScanBegin(lib.ID, fullScan)
Expect(err).ToNot(HaveOccurred())
updatedLib, err := repo.Get(lib.ID)
Expect(err).ToNot(HaveOccurred())
Expect(updatedLib.LastScanStartedAt).ToNot(BeZero())
Expect(updatedLib.FullScanInProgress).To(Equal(expectedFullScanInProgress))
},
Entry("sets FullScanInProgress to true for full scan", true, true),
Entry("sets FullScanInProgress to false for quick scan", false, false),
)
Context("ScanEnd", func() {
BeforeEach(func() {
err := repo.ScanBegin(lib.ID, true)
Expect(err).ToNot(HaveOccurred())
})
It("sets LastScanAt and clears FullScanInProgress and LastScanStartedAt", func() {
err := repo.ScanEnd(lib.ID)
Expect(err).ToNot(HaveOccurred())
updatedLib, err := repo.Get(lib.ID)
Expect(err).ToNot(HaveOccurred())
Expect(updatedLib.LastScanAt).ToNot(BeZero())
Expect(updatedLib.FullScanInProgress).To(BeFalse())
Expect(updatedLib.LastScanStartedAt).To(BeZero())
})
It("sets LastScanAt to be after LastScanStartedAt", func() {
libBefore, err := repo.Get(lib.ID)
Expect(err).ToNot(HaveOccurred())
err = repo.ScanEnd(lib.ID)
Expect(err).ToNot(HaveOccurred())
libAfter, err := repo.Get(lib.ID)
Expect(err).ToNot(HaveOccurred())
Expect(libAfter.LastScanAt).To(BeTemporally(">=", libBefore.LastScanStartedAt))
})
})
})
})


@ -157,7 +157,7 @@ func (s *SQLStore) WithTxImmediate(block func(tx model.DataStore) error, scope .
}, scope...)
}
func (s *SQLStore) GC(ctx context.Context) error {
func (s *SQLStore) GC(ctx context.Context, libraryIDs ...int) error {
trace := func(ctx context.Context, msg string, f func() error) func() error {
return func() error {
start := time.Now()
@ -167,11 +167,17 @@ func (s *SQLStore) GC(ctx context.Context) error {
}
}
// If libraryIDs are provided, scope operations to those libraries where possible
scoped := len(libraryIDs) > 0
if scoped {
log.Debug(ctx, "GC: Running selective garbage collection", "libraryIDs", libraryIDs)
}
err := run.Sequentially(
trace(ctx, "purge empty albums", func() error { return s.Album(ctx).(*albumRepository).purgeEmpty() }),
trace(ctx, "purge empty albums", func() error { return s.Album(ctx).(*albumRepository).purgeEmpty(libraryIDs...) }),
trace(ctx, "purge empty artists", func() error { return s.Artist(ctx).(*artistRepository).purgeEmpty() }),
trace(ctx, "mark missing artists", func() error { return s.Artist(ctx).(*artistRepository).markMissing() }),
trace(ctx, "purge empty folders", func() error { return s.Folder(ctx).(*folderRepository).purgeEmpty() }),
trace(ctx, "purge empty folders", func() error { return s.Folder(ctx).(*folderRepository).purgeEmpty(libraryIDs...) }),
trace(ctx, "clean album annotations", func() error { return s.Album(ctx).(*albumRepository).cleanAnnotations() }),
trace(ctx, "clean artist annotations", func() error { return s.Artist(ctx).(*artistRepository).cleanAnnotations() }),
trace(ctx, "clean media file annotations", func() error { return s.MediaFile(ctx).(*mediaFileRepository).cleanAnnotations() }),


@ -300,6 +300,8 @@
},
"actions": {
"scan": "Scanear Biblioteca",
"quickScan": "Scan Rápido",
"fullScan": "Scan Completo",
"manageUsers": "Gerenciar Acesso do Usuário",
"viewDetails": "Ver Detalhes"
},
@ -308,6 +310,9 @@
"updated": "Biblioteca atualizada com sucesso",
"deleted": "Biblioteca excluída com sucesso",
"scanStarted": "Scan da biblioteca iniciada",
"quickScanStarted": "Scan rápido iniciado",
"fullScanStarted": "Scan completo iniciado",
"scanError": "Erro ao iniciar o scan. Verifique os logs",
"scanCompleted": "Scan da biblioteca concluída"
},
"validation": {
@ -598,11 +603,12 @@
"activity": {
"title": "Atividade",
"totalScanned": "Total de pastas scaneadas",
"quickScan": "Scan rápido",
"fullScan": "Scan completo",
"quickScan": "Rápido",
"fullScan": "Completo",
"selectiveScan": "Seletivo",
"serverUptime": "Uptime do servidor",
"serverDown": "DESCONECTADO",
"scanType": "Tipo",
"scanType": "Último Scan",
"status": "Erro",
"elapsedTime": "Duração"
},


@ -26,24 +26,8 @@ var (
ErrAlreadyScanning = errors.New("already scanning")
)
type Scanner interface {
// ScanAll starts a full scan of the music library. This is a blocking operation.
ScanAll(ctx context.Context, fullScan bool) (warnings []string, err error)
Status(context.Context) (*StatusInfo, error)
}
type StatusInfo struct {
Scanning bool
LastScan time.Time
Count uint32
FolderCount uint32
LastError string
ScanType string
ElapsedTime time.Duration
}
func New(rootCtx context.Context, ds model.DataStore, cw artwork.CacheWarmer, broker events.Broker,
pls core.Playlists, m metrics.Metrics) Scanner {
pls core.Playlists, m metrics.Metrics) model.Scanner {
c := &controller{
rootCtx: rootCtx,
ds: ds,
@ -65,9 +49,10 @@ func (s *controller) getScanner() scanner {
return &scannerImpl{ds: s.ds, cw: s.cw, pls: s.pls}
}
// CallScan starts an in-process scan of the music library.
// CallScan starts an in-process scan of specific library/folder pairs.
// If targets is empty, it scans all libraries.
// This is meant to be called from the command line (see cmd/scan.go).
func CallScan(ctx context.Context, ds model.DataStore, pls core.Playlists, fullScan bool) (<-chan *ProgressInfo, error) {
func CallScan(ctx context.Context, ds model.DataStore, pls core.Playlists, fullScan bool, targets []model.ScanTarget) (<-chan *ProgressInfo, error) {
release, err := lockScan(ctx)
if err != nil {
return nil, err
@ -79,7 +64,7 @@ func CallScan(ctx context.Context, ds model.DataStore, pls core.Playlists, fullS
go func() {
defer close(progress)
scanner := &scannerImpl{ds: ds, cw: artwork.NoopCacheWarmer(), pls: pls}
scanner.scanAll(ctx, fullScan, progress)
scanner.scanFolders(ctx, fullScan, targets, progress)
}()
return progress, nil
}
@ -99,8 +84,11 @@ type ProgressInfo struct {
ForceUpdate bool
}
// scanner defines the interface for different scanner implementations.
// This allows for swapping between in-process and external scanners.
type scanner interface {
scanAll(ctx context.Context, fullScan bool, progress chan<- *ProgressInfo)
// scanFolders performs the actual scanning of folders. If targets is nil, it scans all libraries.
scanFolders(ctx context.Context, fullScan bool, targets []model.ScanTarget, progress chan<- *ProgressInfo)
}
type controller struct {
@ -158,7 +146,7 @@ func (s *controller) getScanInfo(ctx context.Context) (scanType string, elapsed
return scanType, elapsed, lastErr
}
func (s *controller) Status(ctx context.Context) (*StatusInfo, error) {
func (s *controller) Status(ctx context.Context) (*model.ScannerStatus, error) {
lastScanTime, err := s.getLastScanTime(ctx)
if err != nil {
return nil, fmt.Errorf("getting last scan time: %w", err)
@ -167,7 +155,7 @@ func (s *controller) Status(ctx context.Context) (*StatusInfo, error) {
scanType, elapsed, lastErr := s.getScanInfo(ctx)
if running.Load() {
status := &StatusInfo{
status := &model.ScannerStatus{
Scanning: true,
LastScan: lastScanTime,
Count: s.count.Load(),
@ -183,7 +171,7 @@ func (s *controller) Status(ctx context.Context) (*StatusInfo, error) {
if err != nil {
return nil, fmt.Errorf("getting library stats: %w", err)
}
return &StatusInfo{
return &model.ScannerStatus{
Scanning: false,
LastScan: lastScanTime,
Count: uint32(count),
@ -208,6 +196,10 @@ func (s *controller) getCounters(ctx context.Context) (int64, int64, error) {
}
func (s *controller) ScanAll(requestCtx context.Context, fullScan bool) ([]string, error) {
return s.ScanFolders(requestCtx, fullScan, nil)
}
func (s *controller) ScanFolders(requestCtx context.Context, fullScan bool, targets []model.ScanTarget) ([]string, error) {
release, err := lockScan(requestCtx)
if err != nil {
return nil, err
@ -224,7 +216,7 @@ func (s *controller) ScanAll(requestCtx context.Context, fullScan bool) ([]strin
go func() {
defer close(progress)
scanner := s.getScanner()
scanner.scanAll(ctx, fullScan, progress)
scanner.scanFolders(ctx, fullScan, targets, progress)
}()
// Wait for the scan to finish, sending progress events to all connected clients
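The goroutine-plus-channel flow shared by `CallScan` and `ScanFolders` above can be sketched with the stdlib; the type and counts here are trimmed stand-ins, not the real `ProgressInfo`:

```go
package main

import (
	"fmt"
	"time"
)

// progressInfo is a trimmed stand-in for scanner.ProgressInfo.
type progressInfo struct {
	FileCount uint32
	Error     string
}

// startScan mirrors the pattern used above: the scan runs in a
// goroutine, reports over a buffered channel, and closes the channel
// when done so the caller's range loop terminates cleanly.
func startScan() <-chan *progressInfo {
	progress := make(chan *progressInfo, 10)
	go func() {
		defer close(progress)
		for i := uint32(1); i <= 3; i++ {
			time.Sleep(10 * time.Millisecond) // simulated scanning work
			progress <- &progressInfo{FileCount: i}
		}
	}()
	return progress
}

func main() {
	for p := range startScan() {
		fmt.Println("scanned files:", p.FileCount)
	}
	fmt.Println("scan finished")
}
```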


@ -9,6 +9,7 @@ import (
"github.com/navidrome/navidrome/core/artwork"
"github.com/navidrome/navidrome/core/metrics"
"github.com/navidrome/navidrome/db"
"github.com/navidrome/navidrome/model"
"github.com/navidrome/navidrome/persistence"
"github.com/navidrome/navidrome/scanner"
"github.com/navidrome/navidrome/server/events"
@ -20,7 +21,7 @@ import (
var _ = Describe("Controller", func() {
var ctx context.Context
var ds *tests.MockDataStore
var ctrl scanner.Scanner
var ctrl model.Scanner
Describe("Status", func() {
BeforeEach(func() {


@ -8,10 +8,12 @@ import (
"io"
"os"
"os/exec"
"strings"
"github.com/navidrome/navidrome/conf"
"github.com/navidrome/navidrome/log"
. "github.com/navidrome/navidrome/utils/gg"
"github.com/navidrome/navidrome/model"
"github.com/navidrome/navidrome/utils/slice"
)
// scannerExternal is a scanner that runs an external process to do the scanning. It is used to avoid
@ -23,19 +25,41 @@ import (
// process will forward them to the caller.
type scannerExternal struct{}
func (s *scannerExternal) scanAll(ctx context.Context, fullScan bool, progress chan<- *ProgressInfo) {
func (s *scannerExternal) scanFolders(ctx context.Context, fullScan bool, targets []model.ScanTarget, progress chan<- *ProgressInfo) {
s.scan(ctx, fullScan, targets, progress)
}
func (s *scannerExternal) scan(ctx context.Context, fullScan bool, targets []model.ScanTarget, progress chan<- *ProgressInfo) {
exe, err := os.Executable()
if err != nil {
progress <- &ProgressInfo{Error: fmt.Sprintf("failed to get executable path: %s", err)}
return
}
log.Debug(ctx, "Spawning external scanner process", "fullScan", fullScan, "path", exe)
cmd := exec.CommandContext(ctx, exe, "scan",
// Build command arguments
args := []string{
"scan",
"--nobanner", "--subprocess",
"--configfile", conf.Server.ConfigFile,
"--datafolder", conf.Server.DataFolder,
"--cachefolder", conf.Server.CacheFolder,
If(fullScan, "--full", ""))
}
// Add targets if provided
if len(targets) > 0 {
targetsStr := strings.Join(slice.Map(targets, func(t model.ScanTarget) string { return t.String() }), ",")
args = append(args, "--targets", targetsStr)
log.Debug(ctx, "Spawning external scanner process with targets", "fullScan", fullScan, "path", exe, "targets", targetsStr)
} else {
log.Debug(ctx, "Spawning external scanner process", "fullScan", fullScan, "path", exe)
}
// Add full scan flag if needed
if fullScan {
args = append(args, "--full")
}
cmd := exec.CommandContext(ctx, exe, args...)
in, out := io.Pipe()
defer in.Close()
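The `--targets` value built above serializes each pair as `libraryID:folderPath`, joined by commas. Parsing it back can be sketched as follows; this is an assumption about the round-trip format (the real CLI parsing is in cmd/scan.go and may handle edge cases such as paths containing `:` differently):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// target mirrors the libraryID:folderPath pairs serialized above.
type target struct {
	LibraryID int
	Path      string
}

// parseTargets splits a comma-separated list of libraryID:folderPath
// pairs, cutting each entry at the first ':' so folder names with
// spaces survive intact.
func parseTargets(s string) ([]target, error) {
	var out []target
	for _, part := range strings.Split(s, ",") {
		idStr, p, ok := strings.Cut(strings.TrimSpace(part), ":")
		if !ok {
			return nil, fmt.Errorf("invalid target %q: expected libraryID:folderPath", part)
		}
		id, err := strconv.Atoi(idStr)
		if err != nil {
			return nil, fmt.Errorf("invalid library ID in %q: %w", part, err)
		}
		out = append(out, target{LibraryID: id, Path: p})
	}
	return out, nil
}

func main() {
	ts, err := parseTargets("1:Music/Rock,1:Music/Jazz")
	fmt.Println(ts, err)
}
```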


@ -15,9 +15,7 @@ import (
"github.com/navidrome/navidrome/utils/chrono"
)
func newFolderEntry(job *scanJob, path string) *folderEntry {
id := model.FolderID(job.lib, path)
info := job.popLastUpdate(id)
func newFolderEntry(job *scanJob, id, path string, updTime time.Time, hash string) *folderEntry {
f := &folderEntry{
id: id,
job: job,
@ -25,8 +23,8 @@ func newFolderEntry(job *scanJob, path string) *folderEntry {
audioFiles: make(map[string]fs.DirEntry),
imageFiles: make(map[string]fs.DirEntry),
albumIDMap: make(map[string]string),
updTime: info.UpdatedAt,
prevHash: info.Hash,
updTime: updTime,
prevHash: hash,
}
return f
}


@ -40,9 +40,8 @@ var _ = Describe("folder_entry", func() {
UpdatedAt: time.Now().Add(-30 * time.Minute),
Hash: "previous-hash",
}
job.lastUpdates[folderID] = updateInfo
entry := newFolderEntry(job, path)
entry := newFolderEntry(job, folderID, path, updateInfo.UpdatedAt, updateInfo.Hash)
Expect(entry.id).To(Equal(folderID))
Expect(entry.job).To(Equal(job))
@ -53,15 +52,10 @@ var _ = Describe("folder_entry", func() {
Expect(entry.updTime).To(Equal(updateInfo.UpdatedAt))
Expect(entry.prevHash).To(Equal(updateInfo.Hash))
})
})
It("creates a new folder entry with zero time when no previous update exists", func() {
entry := newFolderEntry(job, path)
Expect(entry.updTime).To(BeZero())
Expect(entry.prevHash).To(BeEmpty())
})
It("removes the lastUpdate from the job after popping", func() {
Describe("createFolderEntry", func() {
It("removes the lastUpdate from the job after creation", func() {
folderID := model.FolderID(lib, path)
updateInfo := model.FolderUpdateInfo{
UpdatedAt: time.Now().Add(-30 * time.Minute),
@ -69,8 +63,10 @@ var _ = Describe("folder_entry", func() {
}
job.lastUpdates[folderID] = updateInfo
newFolderEntry(job, path)
entry := job.createFolderEntry(path)
Expect(entry.updTime).To(Equal(updateInfo.UpdatedAt))
Expect(entry.prevHash).To(Equal(updateInfo.Hash))
Expect(job.lastUpdates).ToNot(HaveKey(folderID))
})
})
@ -79,7 +75,8 @@ var _ = Describe("folder_entry", func() {
var entry *folderEntry
BeforeEach(func() {
entry = newFolderEntry(job, path)
folderID := model.FolderID(lib, path)
entry = newFolderEntry(job, folderID, path, time.Time{}, "")
})
Describe("hasNoFiles", func() {
@ -458,7 +455,9 @@ var _ = Describe("folder_entry", func() {
Describe("integration scenarios", func() {
It("handles complete folder lifecycle", func() {
// Create new folder entry
entry := newFolderEntry(job, "music/rock/album")
folderPath := "music/rock/album"
folderID := model.FolderID(lib, folderPath)
entry := newFolderEntry(job, folderID, folderPath, time.Time{}, "")
// Initially new and has no files
Expect(entry.isNew()).To(BeTrue())

scanner/ignore_checker.go

@ -0,0 +1,163 @@
package scanner
import (
"bufio"
"context"
"io/fs"
"path"
"strings"
"github.com/navidrome/navidrome/consts"
"github.com/navidrome/navidrome/log"
ignore "github.com/sabhiram/go-gitignore"
)
// IgnoreChecker manages .ndignore patterns using a stack-based approach.
// Use Push() to add patterns when entering a folder, Pop() when leaving,
// and ShouldIgnore() to check if a path should be ignored.
type IgnoreChecker struct {
fsys fs.FS
patternStack [][]string // Stack of patterns for each folder level
currentPatterns []string // Flattened current patterns
matcher *ignore.GitIgnore // Compiled matcher for current patterns
}
// newIgnoreChecker creates a new IgnoreChecker for the given filesystem.
func newIgnoreChecker(fsys fs.FS) *IgnoreChecker {
return &IgnoreChecker{
fsys: fsys,
patternStack: make([][]string, 0),
}
}
// Push loads .ndignore patterns from the specified folder and adds them to the pattern stack.
// Use this when entering a folder during directory tree traversal.
func (ic *IgnoreChecker) Push(ctx context.Context, folder string) error {
patterns := ic.loadPatternsFromFolder(ctx, folder)
ic.patternStack = append(ic.patternStack, patterns)
ic.rebuildCurrentPatterns()
return nil
}
// Pop removes the most recent patterns from the stack.
// Use this when leaving a folder during directory tree traversal.
func (ic *IgnoreChecker) Pop() {
if len(ic.patternStack) > 0 {
ic.patternStack = ic.patternStack[:len(ic.patternStack)-1]
ic.rebuildCurrentPatterns()
}
}
// PushAllParents pushes patterns from root down to the target path.
// This is a convenience method for when you need to check a specific path
// without recursively walking the tree. It handles the common pattern of
// pushing all parent directories from root to the target.
// This method is optimized to compile patterns only once at the end.
func (ic *IgnoreChecker) PushAllParents(ctx context.Context, targetPath string) error {
if targetPath == "." || targetPath == "" {
// Simple case: just push root
return ic.Push(ctx, ".")
}
// Load patterns for root
patterns := ic.loadPatternsFromFolder(ctx, ".")
ic.patternStack = append(ic.patternStack, patterns)
// Load patterns for each parent directory
currentPath := "."
parts := strings.Split(path.Clean(targetPath), "/")
for _, part := range parts {
if part == "." || part == "" {
continue
}
currentPath = path.Join(currentPath, part)
patterns = ic.loadPatternsFromFolder(ctx, currentPath)
ic.patternStack = append(ic.patternStack, patterns)
}
// Rebuild and compile patterns only once at the end
ic.rebuildCurrentPatterns()
return nil
}
// ShouldIgnore checks if the given path should be ignored based on the current patterns.
// Returns true if the path matches any ignore pattern, false otherwise.
func (ic *IgnoreChecker) ShouldIgnore(ctx context.Context, relPath string) bool {
// Handle root/empty path - never ignore
if relPath == "" || relPath == "." {
return false
}
// If no patterns loaded, nothing to ignore
if ic.matcher == nil {
return false
}
matches := ic.matcher.MatchesPath(relPath)
if matches {
log.Trace(ctx, "Scanner: Ignoring entry matching .ndignore", "path", relPath)
}
return matches
}
// loadPatternsFromFolder reads the .ndignore file in the specified folder and returns the patterns.
// If the file doesn't exist, returns an empty slice.
// If the file exists but is empty, returns a pattern to ignore everything ("**/*").
func (ic *IgnoreChecker) loadPatternsFromFolder(ctx context.Context, folder string) []string {
ignoreFilePath := path.Join(folder, consts.ScanIgnoreFile)
var patterns []string
// Check if .ndignore file exists
if _, err := fs.Stat(ic.fsys, ignoreFilePath); err != nil {
// No .ndignore file in this folder
return patterns
}
// Read and parse the .ndignore file
ignoreFile, err := ic.fsys.Open(ignoreFilePath)
if err != nil {
log.Warn(ctx, "Scanner: Error opening .ndignore file", "path", ignoreFilePath, err)
return patterns
}
defer ignoreFile.Close()
lineScanner := bufio.NewScanner(ignoreFile)
for lineScanner.Scan() {
line := strings.TrimSpace(lineScanner.Text())
if line == "" || strings.HasPrefix(line, "#") {
continue // Skip empty lines, whitespace-only lines, and comments
}
patterns = append(patterns, line)
}
if err := lineScanner.Err(); err != nil {
log.Warn(ctx, "Scanner: Error reading .ndignore file", "path", ignoreFilePath, err)
return patterns
}
// If the .ndignore file is empty, ignore everything
if len(patterns) == 0 {
log.Trace(ctx, "Scanner: .ndignore file is empty, ignoring everything", "path", folder)
patterns = []string{"**/*"}
}
return patterns
}
// rebuildCurrentPatterns flattens the pattern stack into currentPatterns and recompiles the matcher.
func (ic *IgnoreChecker) rebuildCurrentPatterns() {
ic.currentPatterns = make([]string, 0)
for _, patterns := range ic.patternStack {
ic.currentPatterns = append(ic.currentPatterns, patterns...)
}
ic.compilePatterns()
}
// compilePatterns compiles the current patterns into a GitIgnore matcher.
func (ic *IgnoreChecker) compilePatterns() {
if len(ic.currentPatterns) == 0 {
ic.matcher = nil
return
}
ic.matcher = ignore.CompileIgnoreLines(ic.currentPatterns...)
}
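The stack discipline above (push on entering a folder, pop on leaving, match against the flattened patterns) can be demonstrated with a stdlib-only miniature; `path.Match` against the base name stands in for the go-gitignore matcher, which supports far richer syntax (negation, `**` globs):

```go
package main

import (
	"fmt"
	"path"
)

// ignoreStack is a simplified sketch of IgnoreChecker's approach: each
// tree level pushes its patterns, and a path is ignored if its base
// name matches any pattern currently on the stack.
type ignoreStack struct {
	stack [][]string
}

func (s *ignoreStack) Push(patterns []string) { s.stack = append(s.stack, patterns) }

func (s *ignoreStack) Pop() {
	if len(s.stack) > 0 {
		s.stack = s.stack[:len(s.stack)-1]
	}
}

func (s *ignoreStack) ShouldIgnore(p string) bool {
	for _, level := range s.stack {
		for _, pat := range level {
			if ok, _ := path.Match(pat, path.Base(p)); ok {
				return true
			}
		}
	}
	return false
}

func main() {
	var s ignoreStack
	s.Push([]string{"*.txt"}) // patterns from the root .ndignore
	s.Push([]string{"*.mp3"}) // patterns from a subfolder's .ndignore
	fmt.Println(s.ShouldIgnore("a/b.mp3"))   // true: subfolder pattern applies
	s.Pop()                                  // leaving the subfolder
	fmt.Println(s.ShouldIgnore("a/b.mp3"))   // false: its patterns are gone
	fmt.Println(s.ShouldIgnore("notes.txt")) // true: root pattern still applies
}
```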


@ -0,0 +1,313 @@
package scanner
import (
"context"
"testing/fstest"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
var _ = Describe("IgnoreChecker", func() {
Describe("loadPatternsFromFolder", func() {
var ic *IgnoreChecker
var ctx context.Context
BeforeEach(func() {
ctx = context.Background()
})
Context("when .ndignore file does not exist", func() {
It("should return empty patterns", func() {
fsys := fstest.MapFS{}
ic = newIgnoreChecker(fsys)
patterns := ic.loadPatternsFromFolder(ctx, ".")
Expect(patterns).To(BeEmpty())
})
})
Context("when .ndignore file is empty", func() {
It("should return wildcard to ignore everything", func() {
fsys := fstest.MapFS{
".ndignore": &fstest.MapFile{Data: []byte("")},
}
ic = newIgnoreChecker(fsys)
patterns := ic.loadPatternsFromFolder(ctx, ".")
Expect(patterns).To(Equal([]string{"**/*"}))
})
})
DescribeTable("parsing .ndignore content",
func(content string, expectedPatterns []string) {
fsys := fstest.MapFS{
".ndignore": &fstest.MapFile{Data: []byte(content)},
}
ic = newIgnoreChecker(fsys)
patterns := ic.loadPatternsFromFolder(ctx, ".")
Expect(patterns).To(Equal(expectedPatterns))
},
Entry("single pattern", "*.txt", []string{"*.txt"}),
Entry("multiple patterns", "*.txt\n*.log", []string{"*.txt", "*.log"}),
Entry("with comments", "# comment\n*.txt\n# another\n*.log", []string{"*.txt", "*.log"}),
Entry("with empty lines", "*.txt\n\n*.log\n\n", []string{"*.txt", "*.log"}),
Entry("mixed content", "# header\n\n*.txt\n# middle\n*.log\n\n", []string{"*.txt", "*.log"}),
Entry("only comments and empty lines", "# comment\n\n# another\n", []string{"**/*"}),
Entry("trailing newline", "*.txt\n*.log\n", []string{"*.txt", "*.log"}),
Entry("directory pattern", "temp/", []string{"temp/"}),
Entry("wildcard pattern", "**/*.mp3", []string{"**/*.mp3"}),
Entry("multiple wildcards", "**/*.mp3\n**/*.flac\n*.log", []string{"**/*.mp3", "**/*.flac", "*.log"}),
Entry("negation pattern", "!important.txt", []string{"!important.txt"}),
Entry("comment with hash not at start is pattern", "not#comment", []string{"not#comment"}),
Entry("whitespace-only lines skipped", "*.txt\n \n*.log\n\t\n", []string{"*.txt", "*.log"}),
Entry("patterns with whitespace trimmed", " *.txt \n\t*.log\t", []string{"*.txt", "*.log"}),
)
})
Describe("Push and Pop", func() {
var ic *IgnoreChecker
var fsys fstest.MapFS
var ctx context.Context
BeforeEach(func() {
ctx = context.Background()
fsys = fstest.MapFS{
".ndignore": &fstest.MapFile{Data: []byte("*.txt")},
"folder1/.ndignore": &fstest.MapFile{Data: []byte("*.mp3")},
"folder2/.ndignore": &fstest.MapFile{Data: []byte("*.flac")},
}
ic = newIgnoreChecker(fsys)
})
Context("Push", func() {
It("should add patterns to stack", func() {
err := ic.Push(ctx, ".")
Expect(err).ToNot(HaveOccurred())
Expect(len(ic.patternStack)).To(Equal(1))
Expect(ic.currentPatterns).To(ContainElement("*.txt"))
})
It("should compile matcher after push", func() {
err := ic.Push(ctx, ".")
Expect(err).ToNot(HaveOccurred())
Expect(ic.matcher).ToNot(BeNil())
})
It("should accumulate patterns from multiple levels", func() {
err := ic.Push(ctx, ".")
Expect(err).ToNot(HaveOccurred())
err = ic.Push(ctx, "folder1")
Expect(err).ToNot(HaveOccurred())
Expect(len(ic.patternStack)).To(Equal(2))
Expect(ic.currentPatterns).To(ConsistOf("*.txt", "*.mp3"))
})
It("should handle push when no .ndignore exists", func() {
err := ic.Push(ctx, "nonexistent")
Expect(err).ToNot(HaveOccurred())
Expect(len(ic.patternStack)).To(Equal(1))
Expect(ic.currentPatterns).To(BeEmpty())
})
})
Context("Pop", func() {
It("should remove most recent patterns", func() {
err := ic.Push(ctx, ".")
Expect(err).ToNot(HaveOccurred())
err = ic.Push(ctx, "folder1")
Expect(err).ToNot(HaveOccurred())
ic.Pop()
Expect(len(ic.patternStack)).To(Equal(1))
Expect(ic.currentPatterns).To(Equal([]string{"*.txt"}))
})
It("should handle Pop on empty stack gracefully", func() {
Expect(func() { ic.Pop() }).ToNot(Panic())
Expect(ic.patternStack).To(BeEmpty())
})
It("should set matcher to nil when all patterns popped", func() {
err := ic.Push(ctx, ".")
Expect(err).ToNot(HaveOccurred())
Expect(ic.matcher).ToNot(BeNil())
ic.Pop()
Expect(ic.matcher).To(BeNil())
})
It("should update matcher after pop", func() {
err := ic.Push(ctx, ".")
Expect(err).ToNot(HaveOccurred())
err = ic.Push(ctx, "folder1")
Expect(err).ToNot(HaveOccurred())
matcher1 := ic.matcher
ic.Pop()
matcher2 := ic.matcher
Expect(matcher1).ToNot(Equal(matcher2))
})
})
Context("multiple Push/Pop cycles", func() {
It("should maintain correct state through cycles", func() {
err := ic.Push(ctx, ".")
Expect(err).ToNot(HaveOccurred())
Expect(ic.currentPatterns).To(Equal([]string{"*.txt"}))
err = ic.Push(ctx, "folder1")
Expect(err).ToNot(HaveOccurred())
Expect(ic.currentPatterns).To(ConsistOf("*.txt", "*.mp3"))
ic.Pop()
Expect(ic.currentPatterns).To(Equal([]string{"*.txt"}))
err = ic.Push(ctx, "folder2")
Expect(err).ToNot(HaveOccurred())
Expect(ic.currentPatterns).To(ConsistOf("*.txt", "*.flac"))
ic.Pop()
Expect(ic.currentPatterns).To(Equal([]string{"*.txt"}))
ic.Pop()
Expect(ic.currentPatterns).To(BeEmpty())
})
})
})
Describe("PushAllParents", func() {
var ic *IgnoreChecker
var ctx context.Context
BeforeEach(func() {
ctx = context.Background()
fsys := fstest.MapFS{
".ndignore": &fstest.MapFile{Data: []byte("root.txt")},
"folder1/.ndignore": &fstest.MapFile{Data: []byte("level1.txt")},
"folder1/folder2/.ndignore": &fstest.MapFile{Data: []byte("level2.txt")},
"folder1/folder2/folder3/.ndignore": &fstest.MapFile{Data: []byte("level3.txt")},
}
ic = newIgnoreChecker(fsys)
})
DescribeTable("loading parent patterns",
func(targetPath string, expectedStackDepth int, expectedPatterns []string) {
err := ic.PushAllParents(ctx, targetPath)
Expect(err).ToNot(HaveOccurred())
Expect(len(ic.patternStack)).To(Equal(expectedStackDepth))
Expect(ic.currentPatterns).To(ConsistOf(expectedPatterns))
},
Entry("root path", ".", 1, []string{"root.txt"}),
Entry("empty path", "", 1, []string{"root.txt"}),
Entry("single level", "folder1", 2, []string{"root.txt", "level1.txt"}),
Entry("two levels", "folder1/folder2", 3, []string{"root.txt", "level1.txt", "level2.txt"}),
Entry("three levels", "folder1/folder2/folder3", 4, []string{"root.txt", "level1.txt", "level2.txt", "level3.txt"}),
)
It("should only compile patterns once at the end", func() {
// This is a behavioral check: we only verify the matcher is compiled (non-nil) after PushAllParents
err := ic.PushAllParents(ctx, "folder1/folder2")
Expect(err).ToNot(HaveOccurred())
Expect(ic.matcher).ToNot(BeNil())
})
It("should handle paths with dot", func() {
err := ic.PushAllParents(ctx, "./folder1")
Expect(err).ToNot(HaveOccurred())
Expect(len(ic.patternStack)).To(Equal(2))
})
Context("when some parent folders have no .ndignore", func() {
BeforeEach(func() {
fsys := fstest.MapFS{
".ndignore": &fstest.MapFile{Data: []byte("root.txt")},
"folder1/folder2/.ndignore": &fstest.MapFile{Data: []byte("level2.txt")},
}
ic = newIgnoreChecker(fsys)
})
It("should still push all parent levels", func() {
err := ic.PushAllParents(ctx, "folder1/folder2")
Expect(err).ToNot(HaveOccurred())
Expect(len(ic.patternStack)).To(Equal(3)) // root, folder1 (empty), folder2
Expect(ic.currentPatterns).To(ConsistOf("root.txt", "level2.txt"))
})
})
})
Describe("ShouldIgnore", func() {
var ic *IgnoreChecker
var ctx context.Context
BeforeEach(func() {
ctx = context.Background()
})
Context("with no patterns loaded", func() {
It("should not ignore any path", func() {
fsys := fstest.MapFS{}
ic = newIgnoreChecker(fsys)
Expect(ic.ShouldIgnore(ctx, "anything.txt")).To(BeFalse())
Expect(ic.ShouldIgnore(ctx, "folder/file.mp3")).To(BeFalse())
})
})
Context("special paths", func() {
BeforeEach(func() {
fsys := fstest.MapFS{
".ndignore": &fstest.MapFile{Data: []byte("**/*")},
}
ic = newIgnoreChecker(fsys)
err := ic.Push(ctx, ".")
Expect(err).ToNot(HaveOccurred())
})
It("should never ignore root or empty paths", func() {
Expect(ic.ShouldIgnore(ctx, "")).To(BeFalse())
Expect(ic.ShouldIgnore(ctx, ".")).To(BeFalse())
})
It("should ignore all other paths with wildcard", func() {
Expect(ic.ShouldIgnore(ctx, "file.txt")).To(BeTrue())
Expect(ic.ShouldIgnore(ctx, "folder/file.mp3")).To(BeTrue())
})
})
DescribeTable("pattern matching",
func(pattern string, path string, shouldMatch bool) {
fsys := fstest.MapFS{
".ndignore": &fstest.MapFile{Data: []byte(pattern)},
}
ic = newIgnoreChecker(fsys)
err := ic.Push(ctx, ".")
Expect(err).ToNot(HaveOccurred())
Expect(ic.ShouldIgnore(ctx, path)).To(Equal(shouldMatch))
},
Entry("glob match", "*.txt", "file.txt", true),
Entry("glob no match", "*.txt", "file.mp3", false),
Entry("directory pattern match", "tmp/", "tmp/file.txt", true),
Entry("directory pattern no match", "tmp/", "temporary/file.txt", false),
Entry("nested glob match", "**/*.log", "deep/nested/file.log", true),
Entry("nested glob no match", "**/*.log", "deep/nested/file.txt", false),
Entry("specific file match", "ignore.me", "ignore.me", true),
Entry("specific file no match", "ignore.me", "keep.me", false),
Entry("wildcard all", "**/*", "any/path/file.txt", true),
Entry("nested specific match", "temp/*", "temp/cache.db", true),
Entry("nested specific no match", "temp/*", "temporary/cache.db", false),
)
Context("with multiple patterns", func() {
BeforeEach(func() {
fsys := fstest.MapFS{
".ndignore": &fstest.MapFile{Data: []byte("*.txt\n*.log\ntemp/")},
}
ic = newIgnoreChecker(fsys)
err := ic.Push(ctx, ".")
Expect(err).ToNot(HaveOccurred())
})
It("should match any of the patterns", func() {
Expect(ic.ShouldIgnore(ctx, "file.txt")).To(BeTrue())
Expect(ic.ShouldIgnore(ctx, "debug.log")).To(BeTrue())
Expect(ic.ShouldIgnore(ctx, "temp/cache")).To(BeTrue())
Expect(ic.ShouldIgnore(ctx, "music.mp3")).To(BeFalse())
})
})
})
})
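The Push/Pop contract exercised above can be sketched with a minimal stack-based checker. This is a simplified stand-in (it uses `path.Match` on base names instead of the gitignore-style matcher the real `IgnoreChecker` compiles, and all names are illustrative):

```go
package main

import (
	"fmt"
	"path"
)

// ignoreStack is a simplified sketch of the IgnoreChecker pattern stack:
// each Push adds the patterns found at one directory level, and Pop
// removes the most recently pushed level on the way back up the tree.
type ignoreStack struct {
	stack [][]string // one slice of patterns per pushed directory level
}

func (s *ignoreStack) Push(patterns []string) { s.stack = append(s.stack, patterns) }

func (s *ignoreStack) Pop() {
	if len(s.stack) > 0 {
		s.stack = s.stack[:len(s.stack)-1]
	}
}

// Current flattens all levels into the currently active pattern list.
func (s *ignoreStack) Current() []string {
	var all []string
	for _, lvl := range s.stack {
		all = append(all, lvl...)
	}
	return all
}

// ShouldIgnore matches the base name of p against the active patterns.
// The real checker uses gitignore semantics; path.Match is a stand-in.
func (s *ignoreStack) ShouldIgnore(p string) bool {
	if p == "" || p == "." {
		return false // the root is never ignored
	}
	for _, pat := range s.Current() {
		if ok, _ := path.Match(pat, path.Base(p)); ok {
			return true
		}
	}
	return false
}

func main() {
	var ic ignoreStack
	ic.Push([]string{"*.txt"}) // patterns from the root .ndignore
	ic.Push([]string{"*.mp3"}) // patterns from folder1/.ndignore
	fmt.Println(ic.ShouldIgnore("folder1/song.mp3")) // true
	ic.Pop()                                         // leaving folder1
	fmt.Println(ic.ShouldIgnore("folder1/song.mp3")) // false: *.mp3 popped
	fmt.Println(ic.ShouldIgnore("notes.txt"))        // true: root pattern still active
}
```

The key property the tests verify is that patterns accumulate on descent and are discarded symmetrically on ascent, so a sibling folder never sees another sibling's patterns.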

View File

@ -26,58 +26,46 @@ import (
"github.com/navidrome/navidrome/utils/slice"
)
func createPhaseFolders(ctx context.Context, state *scanState, ds model.DataStore, cw artwork.CacheWarmer, libs []model.Library) *phaseFolders {
func createPhaseFolders(ctx context.Context, state *scanState, ds model.DataStore, cw artwork.CacheWarmer) *phaseFolders {
var jobs []*scanJob
var updatedLibs []model.Library
for _, lib := range libs {
if lib.LastScanStartedAt.IsZero() {
err := ds.Library(ctx).ScanBegin(lib.ID, state.fullScan)
if err != nil {
log.Error(ctx, "Scanner: Error updating last scan started at", "lib", lib.Name, err)
state.sendWarning(err.Error())
continue
}
// Reload library to get updated state
l, err := ds.Library(ctx).Get(lib.ID)
if err != nil {
log.Error(ctx, "Scanner: Error reloading library", "lib", lib.Name, err)
state.sendWarning(err.Error())
continue
}
lib = *l
} else {
log.Debug(ctx, "Scanner: Resuming previous scan", "lib", lib.Name, "lastScanStartedAt", lib.LastScanStartedAt, "fullScan", lib.FullScanInProgress)
// Create scan jobs for all libraries
for _, lib := range state.libraries {
// Get target folders for this library if selective scan
var targetFolders []string
if state.isSelectiveScan() {
targetFolders = state.targets[lib.ID]
}
job, err := newScanJob(ctx, ds, cw, lib, state.fullScan)
job, err := newScanJob(ctx, ds, cw, lib, state.fullScan, targetFolders)
if err != nil {
log.Error(ctx, "Scanner: Error creating scan context", "lib", lib.Name, err)
state.sendWarning(err.Error())
continue
}
jobs = append(jobs, job)
updatedLibs = append(updatedLibs, lib)
}
// Update the state with the libraries that have been processed and have their scan timestamps set
state.libraries = updatedLibs
return &phaseFolders{jobs: jobs, ctx: ctx, ds: ds, state: state}
}
type scanJob struct {
lib model.Library
fs storage.MusicFS
cw artwork.CacheWarmer
lastUpdates map[string]model.FolderUpdateInfo
lock sync.Mutex
numFolders atomic.Int64
lib model.Library
fs storage.MusicFS
cw artwork.CacheWarmer
lastUpdates map[string]model.FolderUpdateInfo // Holds last update info for all (DB) folders in this library
targetFolders []string // Specific folders to scan (including all descendants)
lock sync.Mutex
numFolders atomic.Int64
}
func newScanJob(ctx context.Context, ds model.DataStore, cw artwork.CacheWarmer, lib model.Library, fullScan bool) (*scanJob, error) {
lastUpdates, err := ds.Folder(ctx).GetLastUpdates(lib)
func newScanJob(ctx context.Context, ds model.DataStore, cw artwork.CacheWarmer, lib model.Library, fullScan bool, targetFolders []string) (*scanJob, error) {
// Get folder updates, optionally filtered to specific target folders
lastUpdates, err := ds.Folder(ctx).GetFolderUpdateInfo(lib, targetFolders...)
if err != nil {
return nil, fmt.Errorf("getting last updates: %w", err)
}
fileStore, err := storage.For(lib.Path)
if err != nil {
log.Error(ctx, "Error getting storage for library", "library", lib.Name, "path", lib.Path, err)
@ -88,15 +76,17 @@ func newScanJob(ctx context.Context, ds model.DataStore, cw artwork.CacheWarmer,
log.Error(ctx, "Error getting fs for library", "library", lib.Name, "path", lib.Path, err)
return nil, fmt.Errorf("getting fs for library: %w", err)
}
lib.FullScanInProgress = lib.FullScanInProgress || fullScan
return &scanJob{
lib: lib,
fs: fsys,
cw: cw,
lastUpdates: lastUpdates,
lib: lib,
fs: fsys,
cw: cw,
lastUpdates: lastUpdates,
targetFolders: targetFolders,
}, nil
}
// popLastUpdate retrieves and removes the last update info for the given folder ID.
// This is used to track which folders have been found during the directory tree walk.
func (j *scanJob) popLastUpdate(folderID string) model.FolderUpdateInfo {
j.lock.Lock()
defer j.lock.Unlock()
@ -106,6 +96,15 @@ func (j *scanJob) popLastUpdate(folderID string) model.FolderUpdateInfo {
return lastUpdate
}
// createFolderEntry creates a new folderEntry for the given path, using the last update info from the job
// to populate the previous update time and hash. It also removes the folder from the job's lastUpdates map.
// This is used to track which folders have been found during the directory tree walk.
func (j *scanJob) createFolderEntry(path string) *folderEntry {
id := model.FolderID(j.lib, path)
info := j.popLastUpdate(id)
return newFolderEntry(j, id, path, info.UpdatedAt, info.Hash)
}
// phaseFolders represents the first phase of the scanning process, which is responsible
// for scanning all libraries and importing new or updated files. This phase involves
// traversing the directory tree of each library, identifying new or modified media files,
@ -144,7 +143,8 @@ func (p *phaseFolders) producer() ppl.Producer[*folderEntry] {
if utils.IsCtxDone(p.ctx) {
break
}
outputChan, err := walkDirTree(p.ctx, job)
outputChan, err := walkDirTree(p.ctx, job, job.targetFolders...)
if err != nil {
log.Warn(p.ctx, "Scanner: Error scanning library", "lib", job.lib.Name, err)
}

View File

@ -69,9 +69,6 @@ func (p *phaseMissingTracks) produce(put func(tracks *missingTracks)) error {
}
}
for _, lib := range p.state.libraries {
if lib.LastScanStartedAt.IsZero() {
continue
}
log.Debug(p.ctx, "Scanner: Checking missing tracks", "libraryId", lib.ID, "libraryName", lib.Name)
cursor, err := p.ds.MediaFile(p.ctx).GetMissingAndMatching(lib.ID)
if err != nil {

View File

@ -27,14 +27,13 @@ import (
type phaseRefreshAlbums struct {
ds model.DataStore
ctx context.Context
libs model.Libraries
refreshed atomic.Uint32
skipped atomic.Uint32
state *scanState
}
func createPhaseRefreshAlbums(ctx context.Context, state *scanState, ds model.DataStore, libs model.Libraries) *phaseRefreshAlbums {
return &phaseRefreshAlbums{ctx: ctx, ds: ds, libs: libs, state: state}
func createPhaseRefreshAlbums(ctx context.Context, state *scanState, ds model.DataStore) *phaseRefreshAlbums {
return &phaseRefreshAlbums{ctx: ctx, ds: ds, state: state}
}
func (p *phaseRefreshAlbums) description() string {
@ -47,7 +46,7 @@ func (p *phaseRefreshAlbums) producer() ppl.Producer[*model.Album] {
func (p *phaseRefreshAlbums) produce(put func(album *model.Album)) error {
count := 0
for _, lib := range p.libs {
for _, lib := range p.state.libraries {
cursor, err := p.ds.Album(p.ctx).GetTouchedAlbums(lib.ID)
if err != nil {
return fmt.Errorf("loading touched albums: %w", err)

View File

@ -32,8 +32,8 @@ var _ = Describe("phaseRefreshAlbums", func() {
{ID: 1, Name: "Library 1"},
{ID: 2, Name: "Library 2"},
}
state = &scanState{}
phase = createPhaseRefreshAlbums(ctx, state, ds, libs)
state = &scanState{libraries: libs}
phase = createPhaseRefreshAlbums(ctx, state, ds)
})
Describe("description", func() {

View File

@ -3,6 +3,8 @@ package scanner
import (
"context"
"fmt"
"maps"
"slices"
"sync/atomic"
"time"
@ -15,6 +17,7 @@ import (
"github.com/navidrome/navidrome/log"
"github.com/navidrome/navidrome/model"
"github.com/navidrome/navidrome/utils/run"
"github.com/navidrome/navidrome/utils/slice"
)
type scannerImpl struct {
@ -28,7 +31,8 @@ type scanState struct {
progress chan<- *ProgressInfo
fullScan bool
changesDetected atomic.Bool
libraries model.Libraries // Store libraries list for consistency across phases
libraries model.Libraries // Store libraries list for consistency across phases
targets map[int][]string // Optional: map[libraryID][]folderPaths for selective scans
}
func (s *scanState) sendProgress(info *ProgressInfo) {
@ -37,6 +41,10 @@ func (s *scanState) sendProgress(info *ProgressInfo) {
}
}
func (s *scanState) isSelectiveScan() bool {
return len(s.targets) > 0
}
func (s *scanState) sendWarning(msg string) {
s.sendProgress(&ProgressInfo{Warning: msg})
}
@ -45,7 +53,7 @@ func (s *scanState) sendError(err error) {
s.sendProgress(&ProgressInfo{Error: err.Error()})
}
func (s *scannerImpl) scanAll(ctx context.Context, fullScan bool, progress chan<- *ProgressInfo) {
func (s *scannerImpl) scanFolders(ctx context.Context, fullScan bool, targets []model.ScanTarget, progress chan<- *ProgressInfo) {
startTime := time.Now()
state := scanState{
@ -59,38 +67,75 @@ func (s *scannerImpl) scanAll(ctx context.Context, fullScan bool, progress chan<
state.changesDetected.Store(true)
}
libs, err := s.ds.Library(ctx).GetAll()
// Get libraries and optionally filter by targets
allLibs, err := s.ds.Library(ctx).GetAll()
if err != nil {
state.sendWarning(fmt.Sprintf("getting libraries: %s", err))
return
}
state.libraries = libs
log.Info(ctx, "Scanner: Starting scan", "fullScan", state.fullScan, "numLibraries", len(libs))
if len(targets) > 0 {
// Selective scan: filter libraries and build targets map
state.targets = make(map[int][]string)
for _, target := range targets {
folderPath := target.FolderPath
if folderPath == "" {
folderPath = "."
}
state.targets[target.LibraryID] = append(state.targets[target.LibraryID], folderPath)
}
// Filter libraries to only those in targets
state.libraries = slice.Filter(allLibs, func(lib model.Library) bool {
return len(state.targets[lib.ID]) > 0
})
log.Info(ctx, "Scanner: Starting selective scan", "fullScan", state.fullScan, "numLibraries", len(state.libraries), "numTargets", len(targets))
} else {
// Full library scan
state.libraries = allLibs
log.Info(ctx, "Scanner: Starting scan", "fullScan", state.fullScan, "numLibraries", len(state.libraries))
}
// Store scan type and start time
scanType := "quick"
if state.fullScan {
scanType = "full"
}
if state.isSelectiveScan() {
scanType += "-selective"
}
_ = s.ds.Property(ctx).Put(consts.LastScanTypeKey, scanType)
_ = s.ds.Property(ctx).Put(consts.LastScanStartTimeKey, startTime.Format(time.RFC3339))
// if there was a full scan in progress, force a full scan
if !state.fullScan {
for _, lib := range libs {
for _, lib := range state.libraries {
if lib.FullScanInProgress {
log.Info(ctx, "Scanner: Interrupted full scan detected", "lib", lib.Name)
state.fullScan = true
_ = s.ds.Property(ctx).Put(consts.LastScanTypeKey, "full")
if state.isSelectiveScan() {
_ = s.ds.Property(ctx).Put(consts.LastScanTypeKey, "full-selective")
} else {
_ = s.ds.Property(ctx).Put(consts.LastScanTypeKey, "full")
}
break
}
}
}
// Prepare libraries for scanning (initialize LastScanStartedAt if needed)
err = s.prepareLibrariesForScan(ctx, &state)
if err != nil {
log.Error(ctx, "Scanner: Error preparing libraries for scan", err)
state.sendError(err)
return
}
err = run.Sequentially(
// Phase 1: Scan all libraries and import new/updated files
runPhase[*folderEntry](ctx, 1, createPhaseFolders(ctx, &state, s.ds, s.cw, libs)),
runPhase[*folderEntry](ctx, 1, createPhaseFolders(ctx, &state, s.ds, s.cw)),
// Phase 2: Process missing files, checking for moves
runPhase[*missingTracks](ctx, 2, createPhaseMissingTracks(ctx, &state, s.ds)),
@ -98,7 +143,7 @@ func (s *scannerImpl) scanAll(ctx context.Context, fullScan bool, progress chan<
// Phases 3 and 4 can be run in parallel
run.Parallel(
// Phase 3: Refresh all new/changed albums and update artists
runPhase[*model.Album](ctx, 3, createPhaseRefreshAlbums(ctx, &state, s.ds, libs)),
runPhase[*model.Album](ctx, 3, createPhaseRefreshAlbums(ctx, &state, s.ds)),
// Phase 4: Import/update playlists
runPhase[*model.Folder](ctx, 4, createPhasePlaylists(ctx, &state, s.ds, s.pls, s.cw)),
@ -131,7 +176,53 @@ func (s *scannerImpl) scanAll(ctx context.Context, fullScan bool, progress chan<
state.sendProgress(&ProgressInfo{ChangesDetected: true})
}
log.Info(ctx, "Scanner: Finished scanning all libraries", "duration", time.Since(startTime))
if state.isSelectiveScan() {
log.Info(ctx, "Scanner: Finished scanning selected folders", "duration", time.Since(startTime), "numTargets", len(targets))
} else {
log.Info(ctx, "Scanner: Finished scanning all libraries", "duration", time.Since(startTime))
}
}
// prepareLibrariesForScan initializes the scan for all libraries in the state.
// It calls ScanBegin for libraries that haven't started scanning yet (LastScanStartedAt is zero),
// reloads them to get the updated state, and filters out any libraries that fail to initialize.
func (s *scannerImpl) prepareLibrariesForScan(ctx context.Context, state *scanState) error {
var successfulLibs []model.Library
for _, lib := range state.libraries {
if lib.LastScanStartedAt.IsZero() {
// This is a new scan - mark it as started
err := s.ds.Library(ctx).ScanBegin(lib.ID, state.fullScan)
if err != nil {
log.Error(ctx, "Scanner: Error marking scan start", "lib", lib.Name, err)
state.sendWarning(err.Error())
continue
}
// Reload library to get updated state (timestamps, etc.)
reloadedLib, err := s.ds.Library(ctx).Get(lib.ID)
if err != nil {
log.Error(ctx, "Scanner: Error reloading library", "lib", lib.Name, err)
state.sendWarning(err.Error())
continue
}
lib = *reloadedLib
} else {
// This is a resumed scan
log.Debug(ctx, "Scanner: Resuming previous scan", "lib", lib.Name,
"lastScanStartedAt", lib.LastScanStartedAt, "fullScan", lib.FullScanInProgress)
}
successfulLibs = append(successfulLibs, lib)
}
if len(successfulLibs) == 0 {
return fmt.Errorf("no libraries available for scanning")
}
// Update state with only successfully initialized libraries
state.libraries = successfulLibs
return nil
}
func (s *scannerImpl) runGC(ctx context.Context, state *scanState) func() error {
@ -140,7 +231,15 @@ func (s *scannerImpl) runGC(ctx context.Context, state *scanState) func() error
return s.ds.WithTx(func(tx model.DataStore) error {
if state.changesDetected.Load() {
start := time.Now()
err := tx.GC(ctx)
// For selective scans, extract library IDs to scope GC operations
var libraryIDs []int
if state.isSelectiveScan() {
libraryIDs = slices.Collect(maps.Keys(state.targets))
log.Debug(ctx, "Scanner: Running selective GC", "libraryIDs", libraryIDs)
}
err := tx.GC(ctx, libraryIDs...)
if err != nil {
log.Error(ctx, "Scanner: Error running GC", err)
return fmt.Errorf("running GC: %w", err)
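The two scanner-side transformations in this file, grouping `ScanTarget`s into a per-library map and then collecting the affected library IDs to scope GC, can be sketched in isolation. The types here are hypothetical local copies, not imports from the `model` package:

```go
package main

import (
	"fmt"
	"maps"
	"slices"
)

// ScanTarget mirrors the shape used by the scanner: a library ID plus a
// folder path relative to the library root.
type ScanTarget struct {
	LibraryID  int
	FolderPath string
}

// buildTargets groups targets by library, defaulting an empty folder
// path to "." (the library root), as scanFolders does.
func buildTargets(targets []ScanTarget) map[int][]string {
	m := make(map[int][]string)
	for _, t := range targets {
		p := t.FolderPath
		if p == "" {
			p = "."
		}
		m[t.LibraryID] = append(m[t.LibraryID], p)
	}
	return m
}

func main() {
	m := buildTargets([]ScanTarget{
		{LibraryID: 1, FolderPath: "Music/Rock"},
		{LibraryID: 1, FolderPath: "Music/Jazz"},
		{LibraryID: 2, FolderPath: ""},
	})
	// For a selective scan, GC is scoped to just these library IDs,
	// exactly as runGC does with slices.Collect(maps.Keys(state.targets)).
	ids := slices.Sorted(maps.Keys(m))
	fmt.Println(ids)  // [1 2]
	fmt.Println(m[1]) // [Music/Rock Music/Jazz]
	fmt.Println(m[2]) // [.]
}
```

Keeping the map keyed by library ID means the same structure serves both purposes: per-library target folder lists for the scan jobs, and the key set for the scoped GC call.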

View File

@ -32,7 +32,7 @@ var _ = Describe("Scanner - Multi-Library", Ordered, func() {
var ctx context.Context
var lib1, lib2 model.Library
var ds *tests.MockDataStore
var s scanner.Scanner
var s model.Scanner
createFS := func(path string, files fstest.MapFS) storagetest.FakeFS {
fs := storagetest.FakeFS{}

View File

@ -0,0 +1,293 @@
package scanner_test
import (
"context"
"path/filepath"
"testing/fstest"
"github.com/Masterminds/squirrel"
"github.com/navidrome/navidrome/conf"
"github.com/navidrome/navidrome/conf/configtest"
"github.com/navidrome/navidrome/core"
"github.com/navidrome/navidrome/core/artwork"
"github.com/navidrome/navidrome/core/metrics"
"github.com/navidrome/navidrome/core/storage/storagetest"
"github.com/navidrome/navidrome/db"
"github.com/navidrome/navidrome/log"
"github.com/navidrome/navidrome/model"
"github.com/navidrome/navidrome/model/request"
"github.com/navidrome/navidrome/persistence"
"github.com/navidrome/navidrome/scanner"
"github.com/navidrome/navidrome/server/events"
"github.com/navidrome/navidrome/tests"
"github.com/navidrome/navidrome/utils/slice"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
var _ = Describe("ScanFolders", Ordered, func() {
var ctx context.Context
var lib model.Library
var ds model.DataStore
var s model.Scanner
var fsys storagetest.FakeFS
BeforeAll(func() {
ctx = request.WithUser(GinkgoT().Context(), model.User{ID: "123", IsAdmin: true})
tmpDir := GinkgoT().TempDir()
conf.Server.DbPath = filepath.Join(tmpDir, "test-selective-scan.db?_journal_mode=WAL")
log.Warn("Using DB at " + conf.Server.DbPath)
db.Db().SetMaxOpenConns(1)
})
BeforeEach(func() {
DeferCleanup(configtest.SetupConfig())
conf.Server.MusicFolder = "fake:///music"
conf.Server.DevExternalScanner = false
db.Init(ctx)
DeferCleanup(func() {
Expect(tests.ClearDB()).To(Succeed())
})
ds = persistence.New(db.Db())
// Create the admin user in the database to match the context
adminUser := model.User{
ID: "123",
UserName: "admin",
Name: "Admin User",
IsAdmin: true,
NewPassword: "password",
}
Expect(ds.User(ctx).Put(&adminUser)).To(Succeed())
s = scanner.New(ctx, ds, artwork.NoopCacheWarmer(), events.NoopBroker(),
core.NewPlaylists(ds), metrics.NewNoopInstance())
lib = model.Library{ID: 1, Name: "Fake Library", Path: "fake:///music"}
Expect(ds.Library(ctx).Put(&lib)).To(Succeed())
// Initialize fake filesystem
fsys = storagetest.FakeFS{}
storagetest.Register("fake", &fsys)
})
Describe("Adding tracks to the library", func() {
It("scans specified folders recursively including all subdirectories", func() {
rock := template(_t{"albumartist": "Rock Artist", "album": "Rock Album"})
jazz := template(_t{"albumartist": "Jazz Artist", "album": "Jazz Album"})
pop := template(_t{"albumartist": "Pop Artist", "album": "Pop Album"})
createFS(fstest.MapFS{
"rock/track1.mp3": rock(track(1, "Rock Track 1")),
"rock/track2.mp3": rock(track(2, "Rock Track 2")),
"rock/subdir/track3.mp3": rock(track(3, "Rock Track 3")),
"jazz/track4.mp3": jazz(track(1, "Jazz Track 1")),
"jazz/subdir/track5.mp3": jazz(track(2, "Jazz Track 2")),
"pop/track6.mp3": pop(track(1, "Pop Track 1")),
})
// Scan only the "rock" and "jazz" folders (including their subdirectories)
targets := []model.ScanTarget{
{LibraryID: lib.ID, FolderPath: "rock"},
{LibraryID: lib.ID, FolderPath: "jazz"},
}
warnings, err := s.ScanFolders(ctx, false, targets)
Expect(err).ToNot(HaveOccurred())
Expect(warnings).To(BeEmpty())
// Verify all tracks in rock and jazz folders (including subdirectories) were imported
allFiles, err := ds.MediaFile(ctx).GetAll()
Expect(err).ToNot(HaveOccurred())
// Should have 5 tracks (all rock and jazz tracks including subdirectories)
Expect(allFiles).To(HaveLen(5))
// Get the file paths
paths := slice.Map(allFiles, func(mf model.MediaFile) string {
return filepath.ToSlash(mf.Path)
})
// Verify the correct files were scanned (including subdirectories)
Expect(paths).To(ContainElements(
"rock/track1.mp3",
"rock/track2.mp3",
"rock/subdir/track3.mp3",
"jazz/track4.mp3",
"jazz/subdir/track5.mp3",
))
// Verify files in the pop folder were NOT scanned
Expect(paths).ToNot(ContainElement("pop/track6.mp3"))
})
})
Describe("Deleting folders", func() {
Context("when a child folder is deleted", func() {
var (
revolver, help func(...map[string]any) *fstest.MapFile
artistFolderID string
album1FolderID string
album2FolderID string
album1TrackIDs []string
album2TrackIDs []string
)
BeforeEach(func() {
// Setup template functions for creating test files
revolver = storagetest.Template(_t{"albumartist": "The Beatles", "album": "Revolver", "year": 1966})
help = storagetest.Template(_t{"albumartist": "The Beatles", "album": "Help!", "year": 1965})
// Initial filesystem with nested folders
fsys.SetFiles(fstest.MapFS{
"The Beatles/Revolver/01 - Taxman.mp3": revolver(storagetest.Track(1, "Taxman")),
"The Beatles/Revolver/02 - Eleanor Rigby.mp3": revolver(storagetest.Track(2, "Eleanor Rigby")),
"The Beatles/Help!/01 - Help!.mp3": help(storagetest.Track(1, "Help!")),
"The Beatles/Help!/02 - The Night Before.mp3": help(storagetest.Track(2, "The Night Before")),
})
// First scan - import everything
_, err := s.ScanAll(ctx, true)
Expect(err).ToNot(HaveOccurred())
// Verify initial state - all folders exist
folders, err := ds.Folder(ctx).GetAll(model.QueryOptions{Filters: squirrel.Eq{"library_id": lib.ID}})
Expect(err).ToNot(HaveOccurred())
Expect(folders).To(HaveLen(4)) // root, Artist, Album1, Album2
// Store folder IDs for later verification
for _, f := range folders {
switch f.Name {
case "The Beatles":
artistFolderID = f.ID
case "Revolver":
album1FolderID = f.ID
case "Help!":
album2FolderID = f.ID
}
}
// Verify all tracks exist
allTracks, err := ds.MediaFile(ctx).GetAll()
Expect(err).ToNot(HaveOccurred())
Expect(allTracks).To(HaveLen(4))
// Store track IDs for later verification
for _, t := range allTracks {
if t.Album == "Revolver" {
album1TrackIDs = append(album1TrackIDs, t.ID)
} else if t.Album == "Help!" {
album2TrackIDs = append(album2TrackIDs, t.ID)
}
}
// Verify no tracks are missing initially
for _, t := range allTracks {
Expect(t.Missing).To(BeFalse())
}
})
It("should mark child folder and its tracks as missing when parent is scanned", func() {
// Delete the child folder (Help!) from the filesystem
fsys.SetFiles(fstest.MapFS{
"The Beatles/Revolver/01 - Taxman.mp3": revolver(storagetest.Track(1, "Taxman")),
"The Beatles/Revolver/02 - Eleanor Rigby.mp3": revolver(storagetest.Track(2, "Eleanor Rigby")),
// "The Beatles/Help!" folder and its contents are DELETED
})
// Run selective scan on the parent folder (Artist)
// This simulates what the watcher does when a child folder is deleted
_, err := s.ScanFolders(ctx, false, []model.ScanTarget{
{LibraryID: lib.ID, FolderPath: "The Beatles"},
})
Expect(err).ToNot(HaveOccurred())
// Verify the deleted child folder is now marked as missing
deletedFolder, err := ds.Folder(ctx).Get(album2FolderID)
Expect(err).ToNot(HaveOccurred())
Expect(deletedFolder.Missing).To(BeTrue(), "Deleted child folder should be marked as missing")
// Verify the deleted folder's tracks are marked as missing
for _, trackID := range album2TrackIDs {
track, err := ds.MediaFile(ctx).Get(trackID)
Expect(err).ToNot(HaveOccurred())
Expect(track.Missing).To(BeTrue(), "Track in deleted folder should be marked as missing")
}
// Verify the parent folder is still present and not marked as missing
parentFolder, err := ds.Folder(ctx).Get(artistFolderID)
Expect(err).ToNot(HaveOccurred())
Expect(parentFolder.Missing).To(BeFalse(), "Parent folder should not be marked as missing")
// Verify the sibling folder and its tracks are still present and not missing
siblingFolder, err := ds.Folder(ctx).Get(album1FolderID)
Expect(err).ToNot(HaveOccurred())
Expect(siblingFolder.Missing).To(BeFalse(), "Sibling folder should not be marked as missing")
for _, trackID := range album1TrackIDs {
track, err := ds.MediaFile(ctx).Get(trackID)
Expect(err).ToNot(HaveOccurred())
Expect(track.Missing).To(BeFalse(), "Track in sibling folder should not be marked as missing")
}
})
It("should mark deeply nested child folders as missing", func() {
// Add a deeply nested folder structure
fsys.SetFiles(fstest.MapFS{
"The Beatles/Revolver/01 - Taxman.mp3": revolver(storagetest.Track(1, "Taxman")),
"The Beatles/Revolver/02 - Eleanor Rigby.mp3": revolver(storagetest.Track(2, "Eleanor Rigby")),
"The Beatles/Help!/01 - Help!.mp3": help(storagetest.Track(1, "Help!")),
"The Beatles/Help!/02 - The Night Before.mp3": help(storagetest.Track(2, "The Night Before")),
"The Beatles/Help!/Bonus/01 - Bonus Track.mp3": help(storagetest.Track(99, "Bonus Track")),
"The Beatles/Help!/Bonus/Nested/01 - Deep Track.mp3": help(storagetest.Track(100, "Deep Track")),
})
// Rescan to import the new nested structure
_, err := s.ScanAll(ctx, true)
Expect(err).ToNot(HaveOccurred())
// Verify nested folders were created
allFolders, err := ds.Folder(ctx).GetAll(model.QueryOptions{Filters: squirrel.Eq{"library_id": lib.ID}})
Expect(err).ToNot(HaveOccurred())
Expect(len(allFolders)).To(BeNumerically(">", 4), "Should have more folders with nested structure")
// Now delete the entire Help! folder including nested children
fsys.SetFiles(fstest.MapFS{
"The Beatles/Revolver/01 - Taxman.mp3": revolver(storagetest.Track(1, "Taxman")),
"The Beatles/Revolver/02 - Eleanor Rigby.mp3": revolver(storagetest.Track(2, "Eleanor Rigby")),
// All Help! subfolders are deleted
})
// Run selective scan on parent
_, err = s.ScanFolders(ctx, false, []model.ScanTarget{
{LibraryID: lib.ID, FolderPath: "The Beatles"},
})
Expect(err).ToNot(HaveOccurred())
// Verify all Help! folders (including nested ones) are marked as missing
missingFolders, err := ds.Folder(ctx).GetAll(model.QueryOptions{
Filters: squirrel.And{
squirrel.Eq{"library_id": lib.ID},
squirrel.Eq{"missing": true},
},
})
Expect(err).ToNot(HaveOccurred())
Expect(len(missingFolders)).To(BeNumerically(">", 0), "At least one folder should be marked as missing")
// Verify all tracks in deleted folders are marked as missing
allTracks, err := ds.MediaFile(ctx).GetAll()
Expect(err).ToNot(HaveOccurred())
Expect(allTracks).To(HaveLen(6))
for _, track := range allTracks {
if track.Album == "Help!" {
Expect(track.Missing).To(BeTrue(), "All tracks in deleted Help! folder should be marked as missing")
} else if track.Album == "Revolver" {
Expect(track.Missing).To(BeFalse(), "Tracks in Revolver folder should not be marked as missing")
}
}
})
})
})
})
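The CLI entry point accepts pairs like `--targets "1:Music/Rock,1:Music/Jazz"`. The `parseTargets` implementation itself is not shown in this diff, so the following is a hypothetical sketch of such a parser; it splits on the first `:` only, which is how folder names containing spaces (and further colons) survive intact:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// ScanTarget is a hypothetical local copy of the scanner's target pair.
type ScanTarget struct {
	LibraryID  int
	FolderPath string
}

// parseTargets parses a comma-separated list of "libraryID:folderPath"
// pairs, e.g. "1:Music/Rock,1:Music/Jazz". Only the first ':' separates
// the ID from the path, so folder names may contain spaces.
func parseTargets(s string) ([]ScanTarget, error) {
	var targets []ScanTarget
	for _, part := range strings.Split(s, ",") {
		part = strings.TrimSpace(part)
		if part == "" {
			continue
		}
		idStr, folder, ok := strings.Cut(part, ":")
		if !ok {
			return nil, fmt.Errorf("invalid target %q: want libraryID:folderPath", part)
		}
		id, err := strconv.Atoi(idStr)
		if err != nil {
			return nil, fmt.Errorf("invalid library ID in %q: %w", part, err)
		}
		targets = append(targets, ScanTarget{LibraryID: id, FolderPath: folder})
	}
	return targets, nil
}

func main() {
	targets, err := parseTargets("1:Music/Rock,2:Best of Jazz")
	if err != nil {
		panic(err)
	}
	for _, t := range targets {
		fmt.Printf("%d -> %s\n", t.LibraryID, t.FolderPath)
	}
}
```

A parser shaped like this produces exactly the `[]model.ScanTarget` slice that `ScanFolders` consumes in the tests above.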

View File

@ -34,19 +34,19 @@ type _t = map[string]any
var template = storagetest.Template
var track = storagetest.Track
func createFS(files fstest.MapFS) storagetest.FakeFS {
fs := storagetest.FakeFS{}
fs.SetFiles(files)
storagetest.Register("fake", &fs)
return fs
}
var _ = Describe("Scanner", Ordered, func() {
var ctx context.Context
var lib model.Library
var ds *tests.MockDataStore
var mfRepo *mockMediaFileRepo
var s scanner.Scanner
createFS := func(files fstest.MapFS) storagetest.FakeFS {
fs := storagetest.FakeFS{}
fs.SetFiles(files)
storagetest.Register("fake", &fs)
return fs
}
var s model.Scanner
BeforeAll(func() {
ctx = request.WithUser(GinkgoT().Context(), model.User{ID: "123", IsAdmin: true})
@ -478,6 +478,56 @@ var _ = Describe("Scanner", Ordered, func() {
Expect(mf.Missing).To(BeFalse())
})
It("marks tracks as missing when scanning a deleted folder with ScanFolders", func() {
By("Adding a third track to Revolver to have more test data")
fsys.Add("The Beatles/Revolver/03 - I'm Only Sleeping.mp3", revolver(track(3, "I'm Only Sleeping")))
Expect(runScanner(ctx, false)).To(Succeed())
By("Verifying initial state has 5 tracks")
Expect(ds.MediaFile(ctx).CountAll(model.QueryOptions{
Filters: squirrel.Eq{"missing": false},
})).To(Equal(int64(5)))
By("Removing the entire Revolver folder from filesystem")
fsys.Remove("The Beatles/Revolver/01 - Taxman.mp3")
fsys.Remove("The Beatles/Revolver/02 - Eleanor Rigby.mp3")
fsys.Remove("The Beatles/Revolver/03 - I'm Only Sleeping.mp3")
By("Scanning the parent folder (simulating watcher behavior)")
targets := []model.ScanTarget{
{LibraryID: lib.ID, FolderPath: "The Beatles"},
}
_, err := s.ScanFolders(ctx, false, targets)
Expect(err).To(Succeed())
By("Checking all Revolver tracks are marked as missing")
mf, err := findByPath("The Beatles/Revolver/01 - Taxman.mp3")
Expect(err).ToNot(HaveOccurred())
Expect(mf.Missing).To(BeTrue())
mf, err = findByPath("The Beatles/Revolver/02 - Eleanor Rigby.mp3")
Expect(err).ToNot(HaveOccurred())
Expect(mf.Missing).To(BeTrue())
mf, err = findByPath("The Beatles/Revolver/03 - I'm Only Sleeping.mp3")
Expect(err).ToNot(HaveOccurred())
Expect(mf.Missing).To(BeTrue())
By("Checking the Help! tracks are not affected")
mf, err = findByPath("The Beatles/Help!/01 - Help!.mp3")
Expect(err).ToNot(HaveOccurred())
Expect(mf.Missing).To(BeFalse())
mf, err = findByPath("The Beatles/Help!/02 - The Night Before.mp3")
Expect(err).ToNot(HaveOccurred())
Expect(mf.Missing).To(BeFalse())
By("Verifying only 2 non-missing tracks remain (Help! tracks)")
Expect(ds.MediaFile(ctx).CountAll(model.QueryOptions{
Filters: squirrel.Eq{"missing": false},
})).To(Equal(int64(2)))
})
It("does not override artist fields when importing an undertagged file", func() {
By("Making sure artist in the DB contains MBID and sort name")
aa, err := ds.Artist(ctx).GetAll(model.QueryOptions{


@ -1,7 +1,6 @@
package scanner
import (
"bufio"
"context"
"io/fs"
"maps"
@ -11,37 +10,69 @@ import (
"strings"
"github.com/navidrome/navidrome/conf"
"github.com/navidrome/navidrome/consts"
"github.com/navidrome/navidrome/log"
"github.com/navidrome/navidrome/model"
"github.com/navidrome/navidrome/utils"
ignore "github.com/sabhiram/go-gitignore"
)
func walkDirTree(ctx context.Context, job *scanJob) (<-chan *folderEntry, error) {
// walkDirTree recursively walks the directory tree starting from the given targetFolders.
// If no targetFolders are provided, it starts from the root folder (".").
// It returns a channel of folderEntry pointers representing each folder found.
func walkDirTree(ctx context.Context, job *scanJob, targetFolders ...string) (<-chan *folderEntry, error) {
results := make(chan *folderEntry)
folders := targetFolders
if len(targetFolders) == 0 {
// No specific folders provided, scan the root folder
folders = []string{"."}
}
go func() {
defer close(results)
err := walkFolder(ctx, job, ".", nil, results)
if err != nil {
log.Error(ctx, "Scanner: There were errors reading directories from filesystem", "path", job.lib.Path, err)
return
for _, folderPath := range folders {
if utils.IsCtxDone(ctx) {
return
}
// Check if target folder exists before walking it
// If it doesn't exist (e.g., deleted between watcher detection and scan execution),
// skip it so it remains in job.lastUpdates and gets handled in following steps
_, err := fs.Stat(job.fs, folderPath)
if err != nil {
log.Warn(ctx, "Scanner: Target folder does not exist.", "path", folderPath, err)
continue
}
// Create checker and push patterns from root to this folder
checker := newIgnoreChecker(job.fs)
err = checker.PushAllParents(ctx, folderPath)
if err != nil {
log.Error(ctx, "Scanner: Error pushing ignore patterns for target folder", "path", folderPath, err)
continue
}
// Recursively walk this folder and all its children
err = walkFolder(ctx, job, folderPath, checker, results)
if err != nil {
log.Error(ctx, "Scanner: Error walking target folder", "path", folderPath, err)
continue
}
}
log.Debug(ctx, "Scanner: Finished reading folders", "lib", job.lib.Name, "path", job.lib.Path, "numFolders", job.numFolders.Load())
log.Debug(ctx, "Scanner: Finished reading target folders", "lib", job.lib.Name, "path", job.lib.Path, "numFolders", job.numFolders.Load())
}()
return results, nil
}
func walkFolder(ctx context.Context, job *scanJob, currentFolder string, ignorePatterns []string, results chan<- *folderEntry) error {
ignorePatterns = loadIgnoredPatterns(ctx, job.fs, currentFolder, ignorePatterns)
func walkFolder(ctx context.Context, job *scanJob, currentFolder string, checker *IgnoreChecker, results chan<- *folderEntry) error {
// Push patterns for this folder onto the stack
_ = checker.Push(ctx, currentFolder)
defer checker.Pop() // Pop patterns when leaving this folder
folder, children, err := loadDir(ctx, job, currentFolder, ignorePatterns)
folder, children, err := loadDir(ctx, job, currentFolder, checker)
if err != nil {
log.Warn(ctx, "Scanner: Error loading dir. Skipping", "path", currentFolder, err)
return nil
}
for _, c := range children {
err := walkFolder(ctx, job, c, ignorePatterns, results)
err := walkFolder(ctx, job, c, checker, results)
if err != nil {
return err
}
@ -59,50 +90,17 @@ func walkFolder(ctx context.Context, job *scanJob, currentFolder string, ignoreP
return nil
}
func loadIgnoredPatterns(ctx context.Context, fsys fs.FS, currentFolder string, currentPatterns []string) []string {
ignoreFilePath := path.Join(currentFolder, consts.ScanIgnoreFile)
var newPatterns []string
if _, err := fs.Stat(fsys, ignoreFilePath); err == nil {
// Read and parse the .ndignore file
ignoreFile, err := fsys.Open(ignoreFilePath)
if err != nil {
log.Warn(ctx, "Scanner: Error opening .ndignore file", "path", ignoreFilePath, err)
// Continue with previous patterns
} else {
defer ignoreFile.Close()
scanner := bufio.NewScanner(ignoreFile)
for scanner.Scan() {
line := scanner.Text()
if line == "" || strings.HasPrefix(line, "#") {
continue // Skip empty lines and comments
}
newPatterns = append(newPatterns, line)
}
if err := scanner.Err(); err != nil {
log.Warn(ctx, "Scanner: Error reading .ignore file", "path", ignoreFilePath, err)
}
}
// If the .ndignore file is empty, mimic the current behavior and ignore everything
if len(newPatterns) == 0 {
log.Trace(ctx, "Scanner: .ndignore file is empty, ignoring everything", "path", currentFolder)
newPatterns = []string{"**/*"}
} else {
log.Trace(ctx, "Scanner: .ndignore file found ", "path", ignoreFilePath, "patterns", newPatterns)
}
}
// Combine the patterns from the .ndignore file with the ones passed as argument
combinedPatterns := append([]string{}, currentPatterns...)
return append(combinedPatterns, newPatterns...)
}
func loadDir(ctx context.Context, job *scanJob, dirPath string, ignorePatterns []string) (folder *folderEntry, children []string, err error) {
folder = newFolderEntry(job, dirPath)
func loadDir(ctx context.Context, job *scanJob, dirPath string, checker *IgnoreChecker) (folder *folderEntry, children []string, err error) {
// Check if directory exists before creating the folder entry
// This is important to avoid removing the folder from lastUpdates if it doesn't exist
dirInfo, err := fs.Stat(job.fs, dirPath)
if err != nil {
log.Warn(ctx, "Scanner: Error stating dir", "path", dirPath, err)
return nil, nil, err
}
// Now that we know the folder exists, create the entry (which removes it from lastUpdates)
folder = job.createFolderEntry(dirPath)
folder.modTime = dirInfo.ModTime()
dir, err := job.fs.Open(dirPath)
@ -117,12 +115,11 @@ func loadDir(ctx context.Context, job *scanJob, dirPath string, ignorePatterns [
return folder, children, err
}
ignoreMatcher := ignore.CompileIgnoreLines(ignorePatterns...)
entries := fullReadDir(ctx, dirFile)
children = make([]string, 0, len(entries))
for _, entry := range entries {
entryPath := path.Join(dirPath, entry.Name())
if len(ignorePatterns) > 0 && isScanIgnored(ctx, ignoreMatcher, entryPath) {
if checker.ShouldIgnore(ctx, entryPath) {
log.Trace(ctx, "Scanner: Ignoring entry", "path", entryPath)
continue
}
@ -234,6 +231,7 @@ func isDirReadable(ctx context.Context, fsys fs.FS, dirPath string) bool {
var ignoredDirs = []string{
"$RECYCLE.BIN",
"#snapshot",
"@Recycle",
"@Recently-Snapshot",
".streams",
"lost+found",
@ -254,11 +252,3 @@ func isDirIgnored(name string) bool {
func isEntryIgnored(name string) bool {
return strings.HasPrefix(name, ".") && !strings.HasPrefix(name, "..")
}
func isScanIgnored(ctx context.Context, matcher *ignore.GitIgnore, entryPath string) bool {
matches := matcher.MatchesPath(entryPath)
if matches {
log.Trace(ctx, "Scanner: Ignoring entry matching .ndignore: ", "path", entryPath)
}
return matches
}


@ -25,82 +25,196 @@ var _ = Describe("walk_dir_tree", func() {
ctx context.Context
)
BeforeEach(func() {
DeferCleanup(configtest.SetupConfig())
ctx = GinkgoT().Context()
fsys = &mockMusicFS{
FS: fstest.MapFS{
"root/a/.ndignore": {Data: []byte("ignored/*")},
"root/a/f1.mp3": {},
"root/a/f2.mp3": {},
"root/a/ignored/bad.mp3": {},
"root/b/cover.jpg": {},
"root/c/f3": {},
"root/d": {},
"root/d/.ndignore": {},
"root/d/f1.mp3": {},
"root/d/f2.mp3": {},
"root/d/f3.mp3": {},
"root/e/original/f1.mp3": {},
"root/e/symlink": {Mode: fs.ModeSymlink, Data: []byte("original")},
Context("full library", func() {
BeforeEach(func() {
DeferCleanup(configtest.SetupConfig())
ctx = GinkgoT().Context()
fsys = &mockMusicFS{
FS: fstest.MapFS{
"root/a/.ndignore": {Data: []byte("ignored/*")},
"root/a/f1.mp3": {},
"root/a/f2.mp3": {},
"root/a/ignored/bad.mp3": {},
"root/b/cover.jpg": {},
"root/c/f3": {},
"root/d": {},
"root/d/.ndignore": {},
"root/d/f1.mp3": {},
"root/d/f2.mp3": {},
"root/d/f3.mp3": {},
"root/e/original/f1.mp3": {},
"root/e/symlink": {Mode: fs.ModeSymlink, Data: []byte("original")},
},
}
job = &scanJob{
fs: fsys,
lib: model.Library{Path: "/music"},
}
})
// Helper function to call walkDirTree and collect folders from the results channel
getFolders := func() map[string]*folderEntry {
results, err := walkDirTree(ctx, job)
Expect(err).ToNot(HaveOccurred())
folders := map[string]*folderEntry{}
g := errgroup.Group{}
g.Go(func() error {
for folder := range results {
folders[folder.path] = folder
}
return nil
})
_ = g.Wait()
return folders
}
DescribeTable("symlink handling",
func(followSymlinks bool, expectedFolderCount int) {
conf.Server.Scanner.FollowSymlinks = followSymlinks
folders := getFolders()
Expect(folders).To(HaveLen(expectedFolderCount + 2)) // +2 for `.` and `root`
// Basic folder structure checks
Expect(folders["root/a"].audioFiles).To(SatisfyAll(
HaveLen(2),
HaveKey("f1.mp3"),
HaveKey("f2.mp3"),
))
Expect(folders["root/a"].imageFiles).To(BeEmpty())
Expect(folders["root/b"].audioFiles).To(BeEmpty())
Expect(folders["root/b"].imageFiles).To(SatisfyAll(
HaveLen(1),
HaveKey("cover.jpg"),
))
Expect(folders["root/c"].audioFiles).To(BeEmpty())
Expect(folders["root/c"].imageFiles).To(BeEmpty())
Expect(folders).ToNot(HaveKey("root/d"))
// Symlink specific checks
if followSymlinks {
Expect(folders["root/e/symlink"].audioFiles).To(HaveLen(1))
} else {
Expect(folders).ToNot(HaveKey("root/e/symlink"))
}
},
}
job = &scanJob{
fs: fsys,
lib: model.Library{Path: "/music"},
}
Entry("with symlinks enabled", true, 7),
Entry("with symlinks disabled", false, 6),
)
})
// Helper function to call walkDirTree and collect folders from the results channel
getFolders := func() map[string]*folderEntry {
results, err := walkDirTree(ctx, job)
Expect(err).ToNot(HaveOccurred())
folders := map[string]*folderEntry{}
g := errgroup.Group{}
g.Go(func() error {
for folder := range results {
folders[folder.path] = folder
Context("with target folders", func() {
BeforeEach(func() {
DeferCleanup(configtest.SetupConfig())
ctx = GinkgoT().Context()
fsys = &mockMusicFS{
FS: fstest.MapFS{
"Artist/Album1/track1.mp3": {},
"Artist/Album1/track2.mp3": {},
"Artist/Album2/track1.mp3": {},
"Artist/Album2/track2.mp3": {},
"Artist/Album2/Sub/track3.mp3": {},
"OtherArtist/Album3/track1.mp3": {},
},
}
job = &scanJob{
fs: fsys,
lib: model.Library{Path: "/music"},
}
return nil
})
_ = g.Wait()
return folders
}
DescribeTable("symlink handling",
func(followSymlinks bool, expectedFolderCount int) {
conf.Server.Scanner.FollowSymlinks = followSymlinks
folders := getFolders()
It("should recursively walk all subdirectories of target folders", func() {
results, err := walkDirTree(ctx, job, "Artist")
Expect(err).ToNot(HaveOccurred())
Expect(folders).To(HaveLen(expectedFolderCount + 2)) // +2 for `.` and `root`
folders := map[string]*folderEntry{}
g := errgroup.Group{}
g.Go(func() error {
for folder := range results {
folders[folder.path] = folder
}
return nil
})
_ = g.Wait()
// Basic folder structure checks
Expect(folders["root/a"].audioFiles).To(SatisfyAll(
HaveLen(2),
HaveKey("f1.mp3"),
HaveKey("f2.mp3"),
// Should include the target folder and all its descendants
Expect(folders).To(SatisfyAll(
HaveKey("Artist"),
HaveKey("Artist/Album1"),
HaveKey("Artist/Album2"),
HaveKey("Artist/Album2/Sub"),
))
Expect(folders["root/a"].imageFiles).To(BeEmpty())
Expect(folders["root/b"].audioFiles).To(BeEmpty())
Expect(folders["root/b"].imageFiles).To(SatisfyAll(
HaveLen(1),
HaveKey("cover.jpg"),
))
Expect(folders["root/c"].audioFiles).To(BeEmpty())
Expect(folders["root/c"].imageFiles).To(BeEmpty())
Expect(folders).ToNot(HaveKey("root/d"))
// Symlink specific checks
if followSymlinks {
Expect(folders["root/e/symlink"].audioFiles).To(HaveLen(1))
} else {
Expect(folders).ToNot(HaveKey("root/e/symlink"))
// Should not include folders outside the target
Expect(folders).ToNot(HaveKey("OtherArtist"))
Expect(folders).ToNot(HaveKey("OtherArtist/Album3"))
// Verify audio files are present
Expect(folders["Artist/Album1"].audioFiles).To(HaveLen(2))
Expect(folders["Artist/Album2"].audioFiles).To(HaveLen(2))
Expect(folders["Artist/Album2/Sub"].audioFiles).To(HaveLen(1))
})
It("should handle multiple target folders", func() {
results, err := walkDirTree(ctx, job, "Artist/Album1", "OtherArtist")
Expect(err).ToNot(HaveOccurred())
folders := map[string]*folderEntry{}
g := errgroup.Group{}
g.Go(func() error {
for folder := range results {
folders[folder.path] = folder
}
return nil
})
_ = g.Wait()
// Should include both target folders and their descendants
Expect(folders).To(SatisfyAll(
HaveKey("Artist/Album1"),
HaveKey("OtherArtist"),
HaveKey("OtherArtist/Album3"),
))
// Should not include other folders
Expect(folders).ToNot(HaveKey("Artist"))
Expect(folders).ToNot(HaveKey("Artist/Album2"))
Expect(folders).ToNot(HaveKey("Artist/Album2/Sub"))
})
It("should skip non-existent target folders and preserve them in lastUpdates", func() {
// Setup job with lastUpdates for both existing and non-existing folders
job.lastUpdates = map[string]model.FolderUpdateInfo{
model.FolderID(job.lib, "Artist/Album1"): {},
model.FolderID(job.lib, "NonExistent/DeletedFolder"): {},
model.FolderID(job.lib, "OtherArtist/Album3"): {},
}
},
Entry("with symlinks enabled", true, 7),
Entry("with symlinks disabled", false, 6),
)
// Try to scan existing folder and non-existing folder
results, err := walkDirTree(ctx, job, "Artist/Album1", "NonExistent/DeletedFolder")
Expect(err).ToNot(HaveOccurred())
// Collect results
folders := map[string]struct{}{}
for folder := range results {
folders[folder.path] = struct{}{}
}
// Should only include the existing folder
Expect(folders).To(HaveKey("Artist/Album1"))
Expect(folders).ToNot(HaveKey("NonExistent/DeletedFolder"))
// The non-existent folder should still be in lastUpdates (not removed by popLastUpdate)
Expect(job.lastUpdates).To(HaveKey(model.FolderID(job.lib, "NonExistent/DeletedFolder")))
// The existing folder should have been removed from lastUpdates
Expect(job.lastUpdates).ToNot(HaveKey(model.FolderID(job.lib, "Artist/Album1")))
// Folders not in targets should remain in lastUpdates
Expect(job.lastUpdates).To(HaveKey(model.FolderID(job.lib, "OtherArtist/Album3")))
})
})
})
Describe("helper functions", func() {


@ -24,9 +24,9 @@ type Watcher interface {
type watcher struct {
mainCtx context.Context
ds model.DataStore
scanner Scanner
scanner model.Scanner
triggerWait time.Duration
watcherNotify chan model.Library
watcherNotify chan scanNotification
libraryWatchers map[int]*libraryWatcherInstance
mu sync.RWMutex
}
@ -36,14 +36,19 @@ type libraryWatcherInstance struct {
cancel context.CancelFunc
}
type scanNotification struct {
Library *model.Library
FolderPath string
}
// GetWatcher returns the watcher singleton
func GetWatcher(ds model.DataStore, s Scanner) Watcher {
func GetWatcher(ds model.DataStore, s model.Scanner) Watcher {
return singleton.GetInstance(func() *watcher {
return &watcher{
ds: ds,
scanner: s,
triggerWait: conf.Server.Scanner.WatcherWait,
watcherNotify: make(chan model.Library, 1),
watcherNotify: make(chan scanNotification, 1),
libraryWatchers: make(map[int]*libraryWatcherInstance),
}
})
@ -68,11 +73,11 @@ func (w *watcher) Run(ctx context.Context) error {
// Main scan triggering loop
trigger := time.NewTimer(w.triggerWait)
trigger.Stop()
waiting := false
targets := make(map[model.ScanTarget]struct{})
for {
select {
case <-trigger.C:
log.Info("Watcher: Triggering scan")
log.Info("Watcher: Triggering scan for changed folders", "numTargets", len(targets))
status, err := w.scanner.Status(ctx)
if err != nil {
log.Error(ctx, "Watcher: Error retrieving Scanner status", err)
@ -83,9 +88,23 @@ func (w *watcher) Run(ctx context.Context) error {
trigger.Reset(w.triggerWait * 3)
continue
}
waiting = false
// Convert targets map to slice
targetSlice := make([]model.ScanTarget, 0, len(targets))
for target := range targets {
targetSlice = append(targetSlice, target)
}
// Clear targets for next batch
targets = make(map[model.ScanTarget]struct{})
go func() {
_, err := w.scanner.ScanAll(ctx, false)
var err error
if conf.Server.DevSelectiveWatcher {
_, err = w.scanner.ScanFolders(ctx, false, targetSlice)
} else {
_, err = w.scanner.ScanAll(ctx, false)
}
if err != nil {
log.Error(ctx, "Watcher: Error scanning", err)
} else {
@ -102,13 +121,20 @@ func (w *watcher) Run(ctx context.Context) error {
w.libraryWatchers = make(map[int]*libraryWatcherInstance)
w.mu.Unlock()
return nil
case lib := <-w.watcherNotify:
if !waiting {
log.Debug(ctx, "Watcher: Detected changes. Waiting for more changes before triggering scan",
"libraryID", lib.ID, "name", lib.Name, "path", lib.Path)
waiting = true
case notification := <-w.watcherNotify:
lib := notification.Library
folderPath := notification.FolderPath
// If already scheduled for scan, skip
target := model.ScanTarget{LibraryID: lib.ID, FolderPath: folderPath}
if _, exists := targets[target]; exists {
continue
}
targets[target] = struct{}{}
trigger.Reset(w.triggerWait)
log.Debug(ctx, "Watcher: Detected changes. Waiting for more changes before triggering scan",
"libraryID", lib.ID, "name", lib.Name, "path", lib.Path, "folderPath", folderPath)
}
}
}
@ -199,13 +225,18 @@ func (w *watcher) watchLibrary(ctx context.Context, lib *model.Library) error {
log.Info(ctx, "Watcher started for library", "libraryID", lib.ID, "name", lib.Name, "path", lib.Path, "absoluteLibPath", absLibPath)
return w.processLibraryEvents(ctx, lib, fsys, c, absLibPath)
}
// processLibraryEvents processes filesystem events for a library.
func (w *watcher) processLibraryEvents(ctx context.Context, lib *model.Library, fsys storage.MusicFS, events <-chan string, absLibPath string) error {
for {
select {
case <-ctx.Done():
log.Debug(ctx, "Watcher stopped due to context cancellation", "libraryID", lib.ID, "name", lib.Name)
return nil
case path := <-c:
path, err = filepath.Rel(absLibPath, path)
case path := <-events:
path, err := filepath.Rel(absLibPath, path)
if err != nil {
log.Error(ctx, "Error getting relative path", "libraryID", lib.ID, "absolutePath", absLibPath, "path", path, err)
continue
@ -215,12 +246,27 @@ func (w *watcher) watchLibrary(ctx context.Context, lib *model.Library) error {
log.Trace(ctx, "Ignoring change", "libraryID", lib.ID, "path", path)
continue
}
log.Trace(ctx, "Detected change", "libraryID", lib.ID, "path", path, "absoluteLibPath", absLibPath)
// Check if the original path (before resolution) matches .ndignore patterns
// This is crucial for deleted folders - if a deleted folder matches .ndignore,
// we should ignore it BEFORE resolveFolderPath walks up to the parent
if w.shouldIgnoreFolderPath(ctx, fsys, path) {
log.Debug(ctx, "Ignoring change matching .ndignore pattern", "libraryID", lib.ID, "path", path)
continue
}
// Find the folder to scan - validate path exists as directory, walk up if needed
folderPath := resolveFolderPath(fsys, path)
// Double-check after resolution in case the resolved path is different and also matches patterns
if folderPath != path && w.shouldIgnoreFolderPath(ctx, fsys, folderPath) {
log.Trace(ctx, "Ignoring change in folder matching .ndignore pattern", "libraryID", lib.ID, "folderPath", folderPath)
continue
}
// Notify the main watcher of changes
select {
case w.watcherNotify <- *lib:
case w.watcherNotify <- scanNotification{Library: lib, FolderPath: folderPath}:
default:
// Channel is full, notification already pending
}
@ -228,6 +274,47 @@ func (w *watcher) watchLibrary(ctx context.Context, lib *model.Library) error {
}
}
// resolveFolderPath takes a path (which may be a file, a directory, or an entry
// that no longer exists) and returns the folder path to scan. If the path is not
// an existing directory, it walks up to the nearest existing ancestor. It returns
// an empty string when the scan should cover the library root.
func resolveFolderPath(fsys fs.FS, path string) string {
// Handle root paths immediately
if path == "." || path == "" {
return ""
}
folderPath := path
for {
info, err := fs.Stat(fsys, folderPath)
if err == nil && info.IsDir() {
// Found a valid directory
return folderPath
}
if folderPath == "." || folderPath == "" {
// Reached root, scan entire library
return ""
}
// Walk up the tree
dir, _ := filepath.Split(folderPath)
if dir == "" || dir == "." {
return ""
}
// Remove trailing slash
folderPath = filepath.Clean(dir)
}
}
// shouldIgnoreFolderPath checks if the given folderPath should be ignored based on .ndignore patterns
// in the library. It pushes all parent folders onto the IgnoreChecker stack before checking.
func (w *watcher) shouldIgnoreFolderPath(ctx context.Context, fsys storage.MusicFS, folderPath string) bool {
checker := newIgnoreChecker(fsys)
err := checker.PushAllParents(ctx, folderPath)
if err != nil {
log.Warn(ctx, "Watcher: Error pushing ignore patterns for folder", "path", folderPath, err)
}
return checker.ShouldIgnore(ctx, folderPath)
}
func isIgnoredPath(_ context.Context, _ fs.FS, path string) bool {
baseDir, name := filepath.Split(path)
switch {

scanner/watcher_test.go (new file, 491 additions)

@ -0,0 +1,491 @@
package scanner
import (
"context"
"io/fs"
"path/filepath"
"testing/fstest"
"time"
"github.com/navidrome/navidrome/conf"
"github.com/navidrome/navidrome/conf/configtest"
"github.com/navidrome/navidrome/model"
"github.com/navidrome/navidrome/tests"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
var _ = Describe("Watcher", func() {
var ctx context.Context
var cancel context.CancelFunc
var mockScanner *tests.MockScanner
var mockDS *tests.MockDataStore
var w *watcher
var lib *model.Library
BeforeEach(func() {
DeferCleanup(configtest.SetupConfig())
conf.Server.Scanner.WatcherWait = 50 * time.Millisecond // Short wait for tests
ctx, cancel = context.WithCancel(context.Background())
DeferCleanup(cancel)
lib = &model.Library{
ID: 1,
Name: "Test Library",
Path: "/test/library",
}
// Set up mocks
mockScanner = tests.NewMockScanner()
mockDS = &tests.MockDataStore{}
mockLibRepo := &tests.MockLibraryRepo{}
mockLibRepo.SetData(model.Libraries{*lib})
mockDS.MockedLibrary = mockLibRepo
// Create a new watcher instance (not singleton) for testing
w = &watcher{
ds: mockDS,
scanner: mockScanner,
triggerWait: conf.Server.Scanner.WatcherWait,
watcherNotify: make(chan scanNotification, 10),
libraryWatchers: make(map[int]*libraryWatcherInstance),
mainCtx: ctx,
}
})
Describe("Target Collection and Deduplication", func() {
BeforeEach(func() {
// Start watcher in background
go func() {
_ = w.Run(ctx)
}()
// Give watcher time to initialize
time.Sleep(10 * time.Millisecond)
})
It("creates separate targets for different folders", func() {
// Send notifications for different folders
w.watcherNotify <- scanNotification{Library: lib, FolderPath: "artist1"}
time.Sleep(10 * time.Millisecond)
w.watcherNotify <- scanNotification{Library: lib, FolderPath: "artist2"}
// Wait for watcher to process and trigger scan
Eventually(func() int {
return mockScanner.GetScanFoldersCallCount()
}, 200*time.Millisecond, 10*time.Millisecond).Should(Equal(1))
// Verify two targets
calls := mockScanner.GetScanFoldersCalls()
Expect(calls).To(HaveLen(1))
Expect(calls[0].Targets).To(HaveLen(2))
// Extract folder paths
folderPaths := make(map[string]bool)
for _, target := range calls[0].Targets {
Expect(target.LibraryID).To(Equal(1))
folderPaths[target.FolderPath] = true
}
Expect(folderPaths).To(HaveKey("artist1"))
Expect(folderPaths).To(HaveKey("artist2"))
})
It("handles different folder paths correctly", func() {
// Send notification for nested folder
w.watcherNotify <- scanNotification{Library: lib, FolderPath: "artist1/album1"}
// Wait for watcher to process and trigger scan
Eventually(func() int {
return mockScanner.GetScanFoldersCallCount()
}, 200*time.Millisecond, 10*time.Millisecond).Should(Equal(1))
// Verify the target
calls := mockScanner.GetScanFoldersCalls()
Expect(calls).To(HaveLen(1))
Expect(calls[0].Targets).To(HaveLen(1))
Expect(calls[0].Targets[0].FolderPath).To(Equal("artist1/album1"))
})
It("deduplicates folder and file within same folder", func() {
// Send notification for a folder
w.watcherNotify <- scanNotification{Library: lib, FolderPath: "artist1/album1"}
time.Sleep(10 * time.Millisecond)
// Send notification for same folder (as if file change was detected there)
// In practice, watchLibrary() would walk up from file path to folder
w.watcherNotify <- scanNotification{Library: lib, FolderPath: "artist1/album1"}
time.Sleep(10 * time.Millisecond)
// Send another for same folder
w.watcherNotify <- scanNotification{Library: lib, FolderPath: "artist1/album1"}
// Wait for watcher to process and trigger scan
Eventually(func() int {
return mockScanner.GetScanFoldersCallCount()
}, 200*time.Millisecond, 10*time.Millisecond).Should(Equal(1))
// Verify only one target despite multiple file/folder changes
calls := mockScanner.GetScanFoldersCalls()
Expect(calls).To(HaveLen(1))
Expect(calls[0].Targets).To(HaveLen(1))
Expect(calls[0].Targets[0].FolderPath).To(Equal("artist1/album1"))
})
})
Describe("Timer Behavior", func() {
BeforeEach(func() {
// Start watcher in background
go func() {
_ = w.Run(ctx)
}()
// Give watcher time to initialize
time.Sleep(10 * time.Millisecond)
})
It("resets timer on each change (debouncing)", func() {
// Send first notification
w.watcherNotify <- scanNotification{Library: lib, FolderPath: "artist1"}
// Wait a bit less than half the watcher wait time to ensure timer doesn't fire
time.Sleep(20 * time.Millisecond)
// No scan should have been triggered yet
Expect(mockScanner.GetScanFoldersCallCount()).To(Equal(0))
// Send another notification (resets timer)
w.watcherNotify <- scanNotification{Library: lib, FolderPath: "artist1"}
// Wait a bit less than half the watcher wait time again
time.Sleep(20 * time.Millisecond)
// Still no scan
Expect(mockScanner.GetScanFoldersCallCount()).To(Equal(0))
// Wait for full timer to expire after last notification (plus margin)
time.Sleep(60 * time.Millisecond)
// Now scan should have been triggered
Eventually(func() int {
return mockScanner.GetScanFoldersCallCount()
}, 100*time.Millisecond, 10*time.Millisecond).Should(Equal(1))
})
It("triggers scan after quiet period", func() {
// Send notification
w.watcherNotify <- scanNotification{Library: lib, FolderPath: "artist1"}
// No scan immediately
Expect(mockScanner.GetScanFoldersCallCount()).To(Equal(0))
// Wait for quiet period
Eventually(func() int {
return mockScanner.GetScanFoldersCallCount()
}, 200*time.Millisecond, 10*time.Millisecond).Should(Equal(1))
})
})
Describe("Empty and Root Paths", func() {
BeforeEach(func() {
// Start watcher in background
go func() {
_ = w.Run(ctx)
}()
// Give watcher time to initialize
time.Sleep(10 * time.Millisecond)
})
It("handles empty folder path (library root)", func() {
// Send notification with empty folder path
w.watcherNotify <- scanNotification{Library: lib, FolderPath: ""}
// Wait for scan
Eventually(func() int {
return mockScanner.GetScanFoldersCallCount()
}, 200*time.Millisecond, 10*time.Millisecond).Should(Equal(1))
// Should scan the library root
calls := mockScanner.GetScanFoldersCalls()
Expect(calls).To(HaveLen(1))
Expect(calls[0].Targets).To(HaveLen(1))
Expect(calls[0].Targets[0].FolderPath).To(Equal(""))
})
It("deduplicates repeated empty (library root) paths", func() {
// Send two notifications with an empty folder path
w.watcherNotify <- scanNotification{Library: lib, FolderPath: ""}
time.Sleep(10 * time.Millisecond)
w.watcherNotify <- scanNotification{Library: lib, FolderPath: ""}
// Wait for scan
Eventually(func() int {
return mockScanner.GetScanFoldersCallCount()
}, 200*time.Millisecond, 10*time.Millisecond).Should(Equal(1))
// Should have only one target
calls := mockScanner.GetScanFoldersCalls()
Expect(calls).To(HaveLen(1))
Expect(calls[0].Targets).To(HaveLen(1))
})
})
Describe("Multiple Libraries", func() {
var lib2 *model.Library
BeforeEach(func() {
// Create second library
lib2 = &model.Library{
ID: 2,
Name: "Test Library 2",
Path: "/test/library2",
}
mockLibRepo := mockDS.MockedLibrary.(*tests.MockLibraryRepo)
mockLibRepo.SetData(model.Libraries{*lib, *lib2})
// Start watcher in background
go func() {
_ = w.Run(ctx)
}()
// Give watcher time to initialize
time.Sleep(10 * time.Millisecond)
})
It("creates separate targets for different libraries", func() {
// Send notifications for both libraries
w.watcherNotify <- scanNotification{Library: lib, FolderPath: "artist1"}
time.Sleep(10 * time.Millisecond)
w.watcherNotify <- scanNotification{Library: lib2, FolderPath: "artist2"}
// Wait for scan
Eventually(func() int {
return mockScanner.GetScanFoldersCallCount()
}, 200*time.Millisecond, 10*time.Millisecond).Should(Equal(1))
// Verify two targets for different libraries
calls := mockScanner.GetScanFoldersCalls()
Expect(calls).To(HaveLen(1))
Expect(calls[0].Targets).To(HaveLen(2))
// Verify library IDs are different
libraryIDs := make(map[int]bool)
for _, target := range calls[0].Targets {
libraryIDs[target.LibraryID] = true
}
Expect(libraryIDs).To(HaveKey(1))
Expect(libraryIDs).To(HaveKey(2))
})
})
Describe(".ndignore handling", func() {
var ctx context.Context
var cancel context.CancelFunc
var w *watcher
var mockFS *mockMusicFS
var lib *model.Library
var eventChan chan string
var absLibPath string
BeforeEach(func() {
ctx, cancel = context.WithCancel(GinkgoT().Context())
DeferCleanup(cancel)
// Set up library
var err error
absLibPath, err = filepath.Abs(".")
Expect(err).NotTo(HaveOccurred())
lib = &model.Library{
ID: 1,
Name: "Test Library",
Path: absLibPath,
}
// Create watcher with notification channel
w = &watcher{
watcherNotify: make(chan scanNotification, 10),
}
eventChan = make(chan string, 10)
})
// Helper to send an event - converts relative path to absolute
sendEvent := func(relativePath string) {
path := filepath.Join(absLibPath, relativePath)
eventChan <- path
}
// Helper to start the real event processing loop
startEventProcessing := func() {
go func() {
defer GinkgoRecover()
// Exercise the real processLibraryEvents implementation, not a reimplementation
_ = w.processLibraryEvents(ctx, lib, mockFS, eventChan, absLibPath)
}()
}
Context("when a folder matching .ndignore is deleted", func() {
BeforeEach(func() {
// Create filesystem with .ndignore containing _TEMP pattern
// The deleted folder (_TEMP) will NOT exist in the filesystem
mockFS = &mockMusicFS{
FS: fstest.MapFS{
"rock": &fstest.MapFile{Mode: fs.ModeDir},
"rock/.ndignore": &fstest.MapFile{Data: []byte("_TEMP\n")},
"rock/valid_album": &fstest.MapFile{Mode: fs.ModeDir},
"rock/valid_album/track.mp3": &fstest.MapFile{Data: []byte("audio")},
},
}
})
It("should NOT send scan notification when deleted folder matches .ndignore", func() {
startEventProcessing()
// Simulate deletion event for rock/_TEMP
sendEvent("rock/_TEMP")
// Wait a bit to ensure event is processed
time.Sleep(50 * time.Millisecond)
// No notification should have been sent (eventChan is drained by the
// processing loop, so assert on the watcher's notify channel instead)
Consistently(w.watcherNotify, 100*time.Millisecond).Should(BeEmpty())
})
It("should send scan notification for valid folder deletion", func() {
startEventProcessing()
// Simulate deletion event for rock/other_folder (not in .ndignore and doesn't exist)
// Since it doesn't exist in mockFS, resolveFolderPath will walk up to "rock"
sendEvent("rock/other_folder")
// Should receive notification for parent folder
Eventually(w.watcherNotify, 200*time.Millisecond).Should(Receive(Equal(scanNotification{
Library: lib,
FolderPath: "rock",
})))
})
})
Context("with nested folder patterns", func() {
BeforeEach(func() {
mockFS = &mockMusicFS{
FS: fstest.MapFS{
"music": &fstest.MapFile{Mode: fs.ModeDir},
"music/.ndignore": &fstest.MapFile{Data: []byte("**/temp\n**/cache\n")},
"music/rock": &fstest.MapFile{Mode: fs.ModeDir},
"music/rock/artist": &fstest.MapFile{Mode: fs.ModeDir},
},
}
})
It("should NOT send notification when nested ignored folder is deleted", func() {
startEventProcessing()
// Simulate deletion of music/rock/artist/temp (matches **/temp)
sendEvent("music/rock/artist/temp")
// Wait to ensure event is processed
time.Sleep(50 * time.Millisecond)
// No notification should be sent
Expect(w.watcherNotify).To(BeEmpty(), "Expected no scan notification for nested ignored folder")
})
It("should send notification for non-ignored nested folder", func() {
startEventProcessing()
// Simulate change in music/rock/artist (doesn't match any pattern)
sendEvent("music/rock/artist")
// Should receive notification
Eventually(w.watcherNotify, 200*time.Millisecond).Should(Receive(Equal(scanNotification{
Library: lib,
FolderPath: "music/rock/artist",
})))
})
})
Context("with file events in ignored folders", func() {
BeforeEach(func() {
mockFS = &mockMusicFS{
FS: fstest.MapFS{
"rock": &fstest.MapFile{Mode: fs.ModeDir},
"rock/.ndignore": &fstest.MapFile{Data: []byte("_TEMP\n")},
},
}
})
It("should NOT send notification for file changes in ignored folders", func() {
startEventProcessing()
// Simulate file change in rock/_TEMP/file.mp3
sendEvent("rock/_TEMP/file.mp3")
// Wait to ensure event is processed
time.Sleep(50 * time.Millisecond)
// No notification should be sent
Expect(w.watcherNotify).To(BeEmpty(), "Expected no scan notification for file in ignored folder")
})
})
})
})
var _ = Describe("resolveFolderPath", func() {
var mockFS fs.FS
BeforeEach(func() {
// Create a mock filesystem with some directories and files
mockFS = fstest.MapFS{
"artist1": &fstest.MapFile{Mode: fs.ModeDir},
"artist1/album1": &fstest.MapFile{Mode: fs.ModeDir},
"artist1/album1/track1.mp3": &fstest.MapFile{Data: []byte("audio")},
"artist1/album1/track2.mp3": &fstest.MapFile{Data: []byte("audio")},
"artist1/album2": &fstest.MapFile{Mode: fs.ModeDir},
"artist1/album2/song.flac": &fstest.MapFile{Data: []byte("audio")},
"artist2": &fstest.MapFile{Mode: fs.ModeDir},
"artist2/cover.jpg": &fstest.MapFile{Data: []byte("image")},
}
})
It("returns directory path when given a directory", func() {
result := resolveFolderPath(mockFS, "artist1/album1")
Expect(result).To(Equal("artist1/album1"))
})
It("walks up to parent directory when given a file path", func() {
result := resolveFolderPath(mockFS, "artist1/album1/track1.mp3")
Expect(result).To(Equal("artist1/album1"))
})
It("walks up multiple levels if needed", func() {
result := resolveFolderPath(mockFS, "artist1/album1/nonexistent/file.mp3")
Expect(result).To(Equal("artist1/album1"))
})
It("returns empty string for non-existent paths at root", func() {
result := resolveFolderPath(mockFS, "nonexistent/path/file.mp3")
Expect(result).To(Equal(""))
})
It("returns empty string for dot path", func() {
result := resolveFolderPath(mockFS, ".")
Expect(result).To(Equal(""))
})
It("returns empty string for empty path", func() {
result := resolveFolderPath(mockFS, "")
Expect(result).To(Equal(""))
})
It("handles nested file paths correctly", func() {
result := resolveFolderPath(mockFS, "artist1/album2/song.flac")
Expect(result).To(Equal("artist1/album2"))
})
It("resolves to top-level directory", func() {
result := resolveFolderPath(mockFS, "artist2/cover.jpg")
Expect(result).To(Equal("artist2"))
})
})


@@ -18,7 +18,6 @@ import (
"github.com/navidrome/navidrome/core/scrobbler"
"github.com/navidrome/navidrome/log"
"github.com/navidrome/navidrome/model"
-"github.com/navidrome/navidrome/scanner"
"github.com/navidrome/navidrome/server"
"github.com/navidrome/navidrome/server/events"
"github.com/navidrome/navidrome/server/subsonic/responses"
@@ -39,7 +38,7 @@ type Router struct {
players core.Players
provider external.Provider
playlists core.Playlists
-scanner scanner.Scanner
+scanner model.Scanner
broker events.Broker
scrobbler scrobbler.PlayTracker
share core.Share
@@ -48,7 +47,7 @@
}
func New(ds model.DataStore, artwork artwork.Artwork, streamer core.MediaStreamer, archiver core.Archiver,
-players core.Players, provider external.Provider, scanner scanner.Scanner, broker events.Broker,
+players core.Players, provider external.Provider, scanner model.Scanner, broker events.Broker,
playlists core.Playlists, scrobbler scrobbler.PlayTracker, share core.Share, playback playback.PlaybackServer,
metrics metrics.Metrics,
) *Router {


@@ -1,10 +1,13 @@
package subsonic
import (
"fmt"
"net/http"
"slices"
"time"
"github.com/navidrome/navidrome/log"
"github.com/navidrome/navidrome/model"
"github.com/navidrome/navidrome/model/request"
"github.com/navidrome/navidrome/server/subsonic/responses"
"github.com/navidrome/navidrome/utils/req"
@@ -44,15 +47,56 @@ func (api *Router) StartScan(r *http.Request) (*responses.Subsonic, error) {
p := req.Params(r)
fullScan := p.BoolOr("fullScan", false)
// Parse optional target parameters for selective scanning
var targets []model.ScanTarget
if targetParams, err := p.Strings("target"); err == nil && len(targetParams) > 0 {
targets, err = model.ParseTargets(targetParams)
if err != nil {
return nil, newError(responses.ErrorGeneric, fmt.Sprintf("Invalid target parameter: %v", err))
}
// Validate all libraries in targets exist and user has access to them
userLibraries, err := api.ds.User(ctx).GetUserLibraries(loggedUser.ID)
if err != nil {
return nil, newError(responses.ErrorGeneric, "Internal error")
}
// Check each target library
for _, target := range targets {
if !slices.ContainsFunc(userLibraries, func(lib model.Library) bool { return lib.ID == target.LibraryID }) {
return nil, newError(responses.ErrorDataNotFound, fmt.Sprintf("Library with ID %d not found", target.LibraryID))
}
}
// Special case: if single library with empty path and it's the only library in DB, call ScanAll
if len(targets) == 1 && targets[0].FolderPath == "" {
allLibs, err := api.ds.Library(ctx).GetAll()
if err != nil {
return nil, newError(responses.ErrorGeneric, "Internal error")
}
if len(allLibs) == 1 {
targets = nil // This will trigger ScanAll below
}
}
}
go func() {
start := time.Now()
-log.Info(ctx, "Triggering manual scan", "fullScan", fullScan, "user", loggedUser.UserName)
-_, err := api.scanner.ScanAll(ctx, fullScan)
+var err error
+if len(targets) > 0 {
+log.Info(ctx, "Triggering on-demand scan", "fullScan", fullScan, "targets", len(targets), "user", loggedUser.UserName)
+_, err = api.scanner.ScanFolders(ctx, fullScan, targets)
+} else {
+log.Info(ctx, "Triggering on-demand scan", "fullScan", fullScan, "user", loggedUser.UserName)
+_, err = api.scanner.ScanAll(ctx, fullScan)
+}
if err != nil {
log.Error(ctx, "Error scanning", err)
return
}
-log.Info(ctx, "Manual scan complete", "user", loggedUser.UserName, "elapsed", time.Since(start))
+log.Info(ctx, "On-demand scan complete", "user", loggedUser.UserName, "elapsed", time.Since(start))
}()
return api.GetScanStatus(r)
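The handler above delegates target parsing to `model.ParseTargets`. The following is a hypothetical sketch of that parser, inferred from the handler's error cases and the tests that follow; the `parseTargets` name, exact error messages, and validation details here are assumptions, not the actual implementation:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// ScanTarget mirrors model.ScanTarget: a library ID plus a folder path
// relative to the library root ("" means the whole library).
type ScanTarget struct {
	LibraryID  int
	FolderPath string
}

// parseTargets converts "libraryID:folderPath" strings into ScanTargets.
// It rejects entries without a ':' separator and non-positive library IDs,
// matching the error cases exercised by the tests.
func parseTargets(params []string) ([]ScanTarget, error) {
	targets := make([]ScanTarget, 0, len(params))
	for _, p := range params {
		id, folder, ok := strings.Cut(p, ":")
		if !ok {
			return nil, fmt.Errorf("missing ':' in target %q", p)
		}
		libID, err := strconv.Atoi(id)
		if err != nil || libID < 1 {
			return nil, fmt.Errorf("invalid library ID in target %q", p)
		}
		targets = append(targets, ScanTarget{LibraryID: libID, FolderPath: folder})
	}
	return targets, nil
}

func main() {
	ts, err := parseTargets([]string{"1:Music/Rock", "2:"})
	fmt.Println(ts, err) // [{1 Music/Rock} {2 }] <nil>
	_, err = parseTargets([]string{"1MusicRock"})
	fmt.Println(err != nil) // true
}
```

Note that folder paths may contain spaces (one of the PR's test updates covers exactly that), so the parser must split only on the first `:` rather than tokenizing.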


@@ -0,0 +1,396 @@
package subsonic
import (
"context"
"errors"
"net/http/httptest"
"github.com/navidrome/navidrome/model"
"github.com/navidrome/navidrome/model/request"
"github.com/navidrome/navidrome/server/subsonic/responses"
"github.com/navidrome/navidrome/tests"
. "github.com/onsi/ginkgo/v2"
. "github.com/onsi/gomega"
)
var _ = Describe("LibraryScanning", func() {
var api *Router
var ms *tests.MockScanner
BeforeEach(func() {
ms = tests.NewMockScanner()
api = &Router{scanner: ms}
})
Describe("StartScan", func() {
It("requires admin authentication", func() {
// Create non-admin user
ctx := request.WithUser(context.Background(), model.User{
ID: "user-id",
IsAdmin: false,
})
// Create request
r := httptest.NewRequest("GET", "/rest/startScan", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.StartScan(r)
// Should return authorization error
Expect(err).To(HaveOccurred())
Expect(response).To(BeNil())
var subErr subError
ok := errors.As(err, &subErr)
Expect(ok).To(BeTrue())
Expect(subErr.code).To(Equal(responses.ErrorAuthorizationFail))
})
It("triggers a quick scan when no parameters are given", func() {
// Create admin user
ctx := request.WithUser(context.Background(), model.User{
ID: "admin-id",
IsAdmin: true,
})
// Create request with no parameters
r := httptest.NewRequest("GET", "/rest/startScan", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.StartScan(r)
// Should succeed
Expect(err).ToNot(HaveOccurred())
Expect(response).ToNot(BeNil())
// Verify ScanAll was called (eventually, since it's in a goroutine)
Eventually(func() int {
return ms.GetScanAllCallCount()
}).Should(BeNumerically(">", 0))
calls := ms.GetScanAllCalls()
Expect(calls).To(HaveLen(1))
Expect(calls[0].FullScan).To(BeFalse())
})
It("triggers a full scan with fullScan=true", func() {
// Create admin user
ctx := request.WithUser(context.Background(), model.User{
ID: "admin-id",
IsAdmin: true,
})
// Create request with fullScan parameter
r := httptest.NewRequest("GET", "/rest/startScan?fullScan=true", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.StartScan(r)
// Should succeed
Expect(err).ToNot(HaveOccurred())
Expect(response).ToNot(BeNil())
// Verify ScanAll was called with fullScan=true
Eventually(func() int {
return ms.GetScanAllCallCount()
}).Should(BeNumerically(">", 0))
calls := ms.GetScanAllCalls()
Expect(calls).To(HaveLen(1))
Expect(calls[0].FullScan).To(BeTrue())
})
It("triggers a selective scan with single target parameter", func() {
// Setup mocks
mockUserRepo := tests.CreateMockUserRepo()
_ = mockUserRepo.SetUserLibraries("admin-id", []int{1, 2})
mockDS := &tests.MockDataStore{MockedUser: mockUserRepo}
api.ds = mockDS
// Create admin user
ctx := request.WithUser(context.Background(), model.User{
ID: "admin-id",
IsAdmin: true,
})
// Create request with single target parameter
r := httptest.NewRequest("GET", "/rest/startScan?target=1:Music/Rock", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.StartScan(r)
// Should succeed
Expect(err).ToNot(HaveOccurred())
Expect(response).ToNot(BeNil())
// Verify ScanFolders was called with correct targets
Eventually(func() int {
return ms.GetScanFoldersCallCount()
}).Should(BeNumerically(">", 0))
calls := ms.GetScanFoldersCalls()
Expect(calls).To(HaveLen(1))
targets := calls[0].Targets
Expect(targets).To(HaveLen(1))
Expect(targets[0].LibraryID).To(Equal(1))
Expect(targets[0].FolderPath).To(Equal("Music/Rock"))
})
It("triggers a selective scan with multiple target parameters", func() {
// Setup mocks
mockUserRepo := tests.CreateMockUserRepo()
_ = mockUserRepo.SetUserLibraries("admin-id", []int{1, 2})
mockDS := &tests.MockDataStore{MockedUser: mockUserRepo}
api.ds = mockDS
// Create admin user
ctx := request.WithUser(context.Background(), model.User{
ID: "admin-id",
IsAdmin: true,
})
// Create request with multiple target parameters
r := httptest.NewRequest("GET", "/rest/startScan?target=1:Music/Reggae&target=2:Classical/Bach", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.StartScan(r)
// Should succeed
Expect(err).ToNot(HaveOccurred())
Expect(response).ToNot(BeNil())
// Verify ScanFolders was called with correct targets
Eventually(func() int {
return ms.GetScanFoldersCallCount()
}).Should(BeNumerically(">", 0))
calls := ms.GetScanFoldersCalls()
Expect(calls).To(HaveLen(1))
targets := calls[0].Targets
Expect(targets).To(HaveLen(2))
Expect(targets[0].LibraryID).To(Equal(1))
Expect(targets[0].FolderPath).To(Equal("Music/Reggae"))
Expect(targets[1].LibraryID).To(Equal(2))
Expect(targets[1].FolderPath).To(Equal("Classical/Bach"))
})
It("triggers a selective full scan with target and fullScan parameters", func() {
// Setup mocks
mockUserRepo := tests.CreateMockUserRepo()
_ = mockUserRepo.SetUserLibraries("admin-id", []int{1})
mockDS := &tests.MockDataStore{MockedUser: mockUserRepo}
api.ds = mockDS
// Create admin user
ctx := request.WithUser(context.Background(), model.User{
ID: "admin-id",
IsAdmin: true,
})
// Create request with target and fullScan parameters
r := httptest.NewRequest("GET", "/rest/startScan?target=1:Music/Jazz&fullScan=true", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.StartScan(r)
// Should succeed
Expect(err).ToNot(HaveOccurred())
Expect(response).ToNot(BeNil())
// Verify ScanFolders was called with fullScan=true
Eventually(func() int {
return ms.GetScanFoldersCallCount()
}).Should(BeNumerically(">", 0))
calls := ms.GetScanFoldersCalls()
Expect(calls).To(HaveLen(1))
Expect(calls[0].FullScan).To(BeTrue())
targets := calls[0].Targets
Expect(targets).To(HaveLen(1))
})
It("returns error for invalid target format", func() {
// Create admin user
ctx := request.WithUser(context.Background(), model.User{
ID: "admin-id",
IsAdmin: true,
})
// Create request with invalid target format (missing colon)
r := httptest.NewRequest("GET", "/rest/startScan?target=1MusicRock", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.StartScan(r)
// Should return error
Expect(err).To(HaveOccurred())
Expect(response).To(BeNil())
var subErr subError
ok := errors.As(err, &subErr)
Expect(ok).To(BeTrue())
Expect(subErr.code).To(Equal(responses.ErrorGeneric))
})
It("returns error for invalid library ID in target", func() {
// Create admin user
ctx := request.WithUser(context.Background(), model.User{
ID: "admin-id",
IsAdmin: true,
})
// Create request with invalid library ID
r := httptest.NewRequest("GET", "/rest/startScan?target=0:Music/Rock", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.StartScan(r)
// Should return error
Expect(err).To(HaveOccurred())
Expect(response).To(BeNil())
var subErr subError
ok := errors.As(err, &subErr)
Expect(ok).To(BeTrue())
Expect(subErr.code).To(Equal(responses.ErrorGeneric))
})
It("returns error when library does not exist", func() {
// Setup mocks - user has access to library 1 and 2 only
mockUserRepo := tests.CreateMockUserRepo()
_ = mockUserRepo.SetUserLibraries("admin-id", []int{1, 2})
mockDS := &tests.MockDataStore{MockedUser: mockUserRepo}
api.ds = mockDS
// Create admin user
ctx := request.WithUser(context.Background(), model.User{
ID: "admin-id",
IsAdmin: true,
})
// Create request with library ID that doesn't exist
r := httptest.NewRequest("GET", "/rest/startScan?target=999:Music/Rock", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.StartScan(r)
// Should return ErrorDataNotFound
Expect(err).To(HaveOccurred())
Expect(response).To(BeNil())
var subErr subError
ok := errors.As(err, &subErr)
Expect(ok).To(BeTrue())
Expect(subErr.code).To(Equal(responses.ErrorDataNotFound))
})
It("calls ScanAll when single library with empty path and only one library exists", func() {
// Setup mocks - single library in DB
mockUserRepo := tests.CreateMockUserRepo()
_ = mockUserRepo.SetUserLibraries("admin-id", []int{1})
mockLibraryRepo := &tests.MockLibraryRepo{}
mockLibraryRepo.SetData(model.Libraries{
{ID: 1, Name: "Music Library", Path: "/music"},
})
mockDS := &tests.MockDataStore{
MockedUser: mockUserRepo,
MockedLibrary: mockLibraryRepo,
}
api.ds = mockDS
// Create admin user
ctx := request.WithUser(context.Background(), model.User{
ID: "admin-id",
IsAdmin: true,
})
// Create request with single library and empty path
r := httptest.NewRequest("GET", "/rest/startScan?target=1:", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.StartScan(r)
// Should succeed
Expect(err).ToNot(HaveOccurred())
Expect(response).ToNot(BeNil())
// Verify ScanAll was called instead of ScanFolders
Eventually(func() int {
return ms.GetScanAllCallCount()
}).Should(BeNumerically(">", 0))
Expect(ms.GetScanFoldersCallCount()).To(Equal(0))
})
It("calls ScanFolders when single library with empty path but multiple libraries exist", func() {
// Setup mocks - multiple libraries in DB
mockUserRepo := tests.CreateMockUserRepo()
_ = mockUserRepo.SetUserLibraries("admin-id", []int{1, 2})
mockLibraryRepo := &tests.MockLibraryRepo{}
mockLibraryRepo.SetData(model.Libraries{
{ID: 1, Name: "Music Library", Path: "/music"},
{ID: 2, Name: "Audiobooks", Path: "/audiobooks"},
})
mockDS := &tests.MockDataStore{
MockedUser: mockUserRepo,
MockedLibrary: mockLibraryRepo,
}
api.ds = mockDS
// Create admin user
ctx := request.WithUser(context.Background(), model.User{
ID: "admin-id",
IsAdmin: true,
})
// Create request with single library and empty path
r := httptest.NewRequest("GET", "/rest/startScan?target=1:", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.StartScan(r)
// Should succeed
Expect(err).ToNot(HaveOccurred())
Expect(response).ToNot(BeNil())
// Verify ScanFolders was called (not ScanAll)
Eventually(func() int {
return ms.GetScanFoldersCallCount()
}).Should(BeNumerically(">", 0))
calls := ms.GetScanFoldersCalls()
Expect(calls).To(HaveLen(1))
targets := calls[0].Targets
Expect(targets).To(HaveLen(1))
Expect(targets[0].LibraryID).To(Equal(1))
Expect(targets[0].FolderPath).To(Equal(""))
})
})
Describe("GetScanStatus", func() {
It("returns scan status", func() {
// Setup mock scanner status
ms.SetStatusResponse(&model.ScannerStatus{
Scanning: false,
Count: 100,
FolderCount: 10,
})
// Create request
ctx := context.Background()
r := httptest.NewRequest("GET", "/rest/getScanStatus", nil)
r = r.WithContext(ctx)
// Call endpoint
response, err := api.GetScanStatus(r)
// Should succeed
Expect(err).ToNot(HaveOccurred())
Expect(response).ToNot(BeNil())
Expect(response.ScanStatus).ToNot(BeNil())
Expect(response.ScanStatus.Scanning).To(BeFalse())
Expect(response.ScanStatus.Count).To(Equal(int64(100)))
Expect(response.ScanStatus.FolderCount).To(Equal(int64(10)))
})
})
})


@@ -28,6 +28,10 @@ type MockDataStore struct {
MockedRadio model.RadioRepository
scrobbleBufferMu sync.Mutex
repoMu sync.Mutex
// GC tracking
GCCalled bool
GCError error
}
func (db *MockDataStore) Library(ctx context.Context) model.LibraryRepository {
@@ -258,6 +262,10 @@ func (db *MockDataStore) Resource(ctx context.Context, m any) model.ResourceRepo
}
}
-func (db *MockDataStore) GC(context.Context) error {
+func (db *MockDataStore) GC(context.Context, ...int) error {
db.GCCalled = true
if db.GCError != nil {
return db.GCError
}
return nil
}

tests/mock_scanner.go (new file, 120 lines)

@@ -0,0 +1,120 @@
package tests
import (
"context"
"sync"
"github.com/navidrome/navidrome/model"
)
// MockScanner implements model.Scanner for testing, with proper synchronization
type MockScanner struct {
mu sync.Mutex
scanAllCalls []ScanAllCall
scanFoldersCalls []ScanFoldersCall
scanningStatus bool
statusResponse *model.ScannerStatus
}
type ScanAllCall struct {
FullScan bool
}
type ScanFoldersCall struct {
FullScan bool
Targets []model.ScanTarget
}
func NewMockScanner() *MockScanner {
return &MockScanner{
scanAllCalls: make([]ScanAllCall, 0),
scanFoldersCalls: make([]ScanFoldersCall, 0),
}
}
func (m *MockScanner) ScanAll(_ context.Context, fullScan bool) ([]string, error) {
m.mu.Lock()
defer m.mu.Unlock()
m.scanAllCalls = append(m.scanAllCalls, ScanAllCall{FullScan: fullScan})
return nil, nil
}
func (m *MockScanner) ScanFolders(_ context.Context, fullScan bool, targets []model.ScanTarget) ([]string, error) {
m.mu.Lock()
defer m.mu.Unlock()
// Make a copy of targets to avoid race conditions
targetsCopy := make([]model.ScanTarget, len(targets))
copy(targetsCopy, targets)
m.scanFoldersCalls = append(m.scanFoldersCalls, ScanFoldersCall{
FullScan: fullScan,
Targets: targetsCopy,
})
return nil, nil
}
func (m *MockScanner) Status(_ context.Context) (*model.ScannerStatus, error) {
m.mu.Lock()
defer m.mu.Unlock()
if m.statusResponse != nil {
return m.statusResponse, nil
}
return &model.ScannerStatus{
Scanning: m.scanningStatus,
}, nil
}
func (m *MockScanner) GetScanAllCallCount() int {
m.mu.Lock()
defer m.mu.Unlock()
return len(m.scanAllCalls)
}
func (m *MockScanner) GetScanAllCalls() []ScanAllCall {
m.mu.Lock()
defer m.mu.Unlock()
// Return a copy to avoid race conditions
calls := make([]ScanAllCall, len(m.scanAllCalls))
copy(calls, m.scanAllCalls)
return calls
}
func (m *MockScanner) GetScanFoldersCallCount() int {
m.mu.Lock()
defer m.mu.Unlock()
return len(m.scanFoldersCalls)
}
func (m *MockScanner) GetScanFoldersCalls() []ScanFoldersCall {
m.mu.Lock()
defer m.mu.Unlock()
// Return a copy to avoid race conditions
calls := make([]ScanFoldersCall, len(m.scanFoldersCalls))
copy(calls, m.scanFoldersCalls)
return calls
}
func (m *MockScanner) Reset() {
m.mu.Lock()
defer m.mu.Unlock()
m.scanAllCalls = make([]ScanAllCall, 0)
m.scanFoldersCalls = make([]ScanFoldersCall, 0)
}
func (m *MockScanner) SetScanning(scanning bool) {
m.mu.Lock()
defer m.mu.Unlock()
m.scanningStatus = scanning
}
func (m *MockScanner) SetStatusResponse(status *model.ScannerStatus) {
m.mu.Lock()
defer m.mu.Unlock()
m.statusResponse = status
}


@@ -302,6 +302,8 @@
},
"actions": {
"scan": "Scan Library",
"quickScan": "Quick Scan",
"fullScan": "Full Scan",
"manageUsers": "Manage User Access",
"viewDetails": "View Details"
},
@@ -310,6 +312,9 @@
"updated": "Library updated successfully",
"deleted": "Library deleted successfully",
"scanStarted": "Library scan started",
"quickScanStarted": "Quick scan started",
"fullScanStarted": "Full scan started",
"scanError": "Error starting scan. Check logs",
"scanCompleted": "Library scan completed"
},
"validation": {
@@ -600,11 +605,12 @@
"activity": {
"title": "Activity",
"totalScanned": "Total Folders Scanned",
-"quickScan": "Quick Scan",
-"fullScan": "Full Scan",
+"quickScan": "Quick",
+"fullScan": "Full",
+"selectiveScan": "Selective",
"serverUptime": "Server Uptime",
"serverDown": "OFFLINE",
-"scanType": "Type",
+"scanType": "Last Scan",
"status": "Scan Error",
"elapsedTime": "Elapsed Time"
},


@@ -113,6 +113,9 @@ const ActivityPanel = () => {
return translate('activity.fullScan')
case 'quick':
return translate('activity.quickScan')
case 'full-selective':
case 'quick-selective':
return translate('activity.selectiveScan')
default:
return ''
}


@@ -10,6 +10,8 @@ import {
} from 'react-admin'
import { useMediaQuery } from '@material-ui/core'
import { List, DateField, useResourceRefresh, SizeField } from '../common'
import LibraryListBulkActions from './LibraryListBulkActions'
import LibraryListActions from './LibraryListActions'
const LibraryFilter = (props) => (
<Filter {...props} variant={'outlined'}>
@@ -26,8 +28,9 @@ const LibraryList = (props) => {
{...props}
sort={{ field: 'name', order: 'ASC' }}
exporter={false}
-bulkActionButtons={false}
+bulkActionButtons={!isXsmall && <LibraryListBulkActions />}
filters={<LibraryFilter />}
actions={<LibraryListActions />}
>
{isXsmall ? (
<SimpleList


@@ -0,0 +1,30 @@
import React, { cloneElement } from 'react'
import { sanitizeListRestProps, TopToolbar } from 'react-admin'
import LibraryScanButton from './LibraryScanButton'
const LibraryListActions = ({
className,
filters,
resource,
showFilter,
displayedFilters,
filterValues,
...rest
}) => {
return (
<TopToolbar className={className} {...sanitizeListRestProps(rest)}>
{filters &&
cloneElement(filters, {
resource,
showFilter,
displayedFilters,
filterValues,
context: 'button',
})}
<LibraryScanButton fullScan={false} />
<LibraryScanButton fullScan={true} />
</TopToolbar>
)
}
export default LibraryListActions


@@ -0,0 +1,11 @@
import React from 'react'
import LibraryScanButton from './LibraryScanButton'
const LibraryListBulkActions = (props) => (
<>
<LibraryScanButton fullScan={false} {...props} />
<LibraryScanButton fullScan={true} {...props} />
</>
)
export default LibraryListBulkActions


@@ -0,0 +1,77 @@
import React, { useState } from 'react'
import PropTypes from 'prop-types'
import {
Button,
useNotify,
useRefresh,
useTranslate,
useUnselectAll,
} from 'react-admin'
import { useSelector } from 'react-redux'
import SyncIcon from '@material-ui/icons/Sync'
import CachedIcon from '@material-ui/icons/Cached'
import subsonic from '../subsonic'
const LibraryScanButton = ({ fullScan, selectedIds, className }) => {
const [loading, setLoading] = useState(false)
const notify = useNotify()
const refresh = useRefresh()
const translate = useTranslate()
const unselectAll = useUnselectAll()
const scanStatus = useSelector((state) => state.activity.scanStatus)
const handleClick = async () => {
setLoading(true)
try {
// Build scan options
const options = { fullScan }
// If specific libraries are selected, scan only those
// Format: "libraryID:" to scan entire library (no folder path specified)
if (selectedIds && selectedIds.length > 0) {
options.target = selectedIds.map((id) => `${id}:`)
}
await subsonic.startScan(options)
const notificationKey = fullScan
? 'resources.library.notifications.fullScanStarted'
: 'resources.library.notifications.quickScanStarted'
notify(notificationKey, 'info')
refresh()
// Unselect all items after successful scan
unselectAll('library')
} catch (error) {
notify('resources.library.notifications.scanError', 'warning')
} finally {
setLoading(false)
}
}
const isDisabled = loading || scanStatus.scanning
const label = fullScan
? translate('resources.library.actions.fullScan')
: translate('resources.library.actions.quickScan')
const icon = fullScan ? <CachedIcon /> : <SyncIcon />
return (
<Button
onClick={handleClick}
disabled={isDisabled}
label={label}
className={className}
>
{icon}
</Button>
)
}
LibraryScanButton.propTypes = {
fullScan: PropTypes.bool.isRequired,
selectedIds: PropTypes.array,
className: PropTypes.string,
}
export default LibraryScanButton


@@ -23,7 +23,13 @@ const url = (command, id, options) => {
delete options.ts
}
Object.keys(options).forEach((k) => {
-params.append(k, options[k])
+const value = options[k]
+// Handle array parameters by appending each value separately
+if (Array.isArray(value)) {
+value.forEach((v) => params.append(k, v))
+} else {
+params.append(k, value)
+}
})
}
return `/rest/${command}?${params.toString()}`


@@ -171,3 +171,14 @@ func SeqFunc[I, O any](s []I, f func(I) O) iter.Seq[O] {
}
}
}
// Filter returns a new slice containing only the elements of s for which filterFunc returns true
func Filter[T any](s []T, filterFunc func(T) bool) []T {
var result []T
for _, item := range s {
if filterFunc(item) {
result = append(result, item)
}
}
return result
}


@@ -172,4 +172,42 @@ var _ = Describe("Slice Utils", func() {
Expect(result).To(ConsistOf("2", "4", "6", "8"))
})
})
Describe("Filter", func() {
It("returns empty slice for an empty input", func() {
filterFunc := func(v int) bool { return v > 0 }
result := slice.Filter([]int{}, filterFunc)
Expect(result).To(BeEmpty())
})
It("returns all elements when filter matches all", func() {
filterFunc := func(v int) bool { return v > 0 }
result := slice.Filter([]int{1, 2, 3, 4}, filterFunc)
Expect(result).To(HaveExactElements(1, 2, 3, 4))
})
It("returns empty slice when filter matches none", func() {
filterFunc := func(v int) bool { return v > 10 }
result := slice.Filter([]int{1, 2, 3, 4}, filterFunc)
Expect(result).To(BeEmpty())
})
It("returns only matching elements", func() {
filterFunc := func(v int) bool { return v%2 == 0 }
result := slice.Filter([]int{1, 2, 3, 4, 5, 6}, filterFunc)
Expect(result).To(HaveExactElements(2, 4, 6))
})
It("works with string slices", func() {
filterFunc := func(s string) bool { return len(s) > 3 }
result := slice.Filter([]string{"a", "abc", "abcd", "ab", "abcde"}, filterFunc)
Expect(result).To(HaveExactElements("abcd", "abcde"))
})
It("preserves order of elements", func() {
filterFunc := func(v int) bool { return v%2 == 1 }
result := slice.Filter([]int{9, 8, 7, 6, 5, 4, 3, 2, 1}, filterFunc)
Expect(result).To(HaveExactElements(9, 7, 5, 3, 1))
})
})
})