Allow filters with limit <= 10 (previously treated as scrapers)

Mike Dilger 2024-02-20 06:41:03 +13:00
parent 452522c39d
commit 3a9f4a7e98
4 changed files with 4 additions and 2 deletions

@@ -34,6 +34,7 @@ Filters which are broad are considered scrapers and are not serviced. Filters mu
- A non-empty `authors` list is set and a non-empty `kinds` list is set.
- A non-empty `authors` list is set and at least one tag is set.
- A non-empty `kinds` list is set and at least one tag is set.
- Has a limit <= 10.
If you wish to change these rules, change the source code at `nostr.rs:screen_outgoing_event()`.
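Taken together, the rules above amount to a single predicate: a filter is serviced if it combines at least two of `authors`/`kinds`/tags, or carries a small limit. A minimal sketch of that predicate follows; the `Filter` struct, its field names, and `is_narrow_enough` are illustrative stand-ins, not chorus's actual types.

```rust
// Illustrative filter shape; chorus's real Filter type differs.
struct Filter {
    authors: Vec<String>,
    kinds: Vec<u32>,
    tags: Vec<(char, String)>,
    limit: Option<usize>,
}

impl Filter {
    /// True if the filter is narrow enough to be serviced
    /// (i.e., it is not considered a scraper).
    fn is_narrow_enough(&self) -> bool {
        let has_authors = !self.authors.is_empty();
        let has_kinds = !self.kinds.is_empty();
        let has_tag = !self.tags.is_empty();
        (has_authors && has_kinds)
            || (has_authors && has_tag)
            || (has_kinds && has_tag)
            // The rule this commit adds: a small limit alone suffices.
            || self.limit.map_or(false, |l| l <= 10)
    }
}
```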

@@ -111,6 +111,7 @@ This is a boolean indicating whether or not scraping is allowed. Scraping is any
- A non-empty `authors` list is set and a non-empty `kinds` list is set.
- A non-empty `authors` list is set and at least one tag is set.
- A non-empty `kinds` list is set and at least one tag is set.
- Has a limit <= 10.
Filters that fail to match these conditions will be rejected if `allow_scraping` is false.

@@ -191,7 +191,7 @@ impl WebSocketService {
NostrReplyPrefix::Restricted,
PERSONAL_MSG.to_owned(),
)
-},
+}
_ => NostrReply::Ok(id, false, NostrReplyPrefix::Error, format!("{}", e)),
},
};

@@ -354,7 +354,7 @@ impl Store {
}
}
}
-} else if self.allow_scraping {
+} else if self.allow_scraping || filter.limit() <= 10 {
// This is INEFFICIENT as it scans through EVERY EVENT
// but the filter is a scraper and we don't have a lot of support
// for scrapers.
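The guarded branch above can be sketched as follows: serve a broad filter only when scraping is allowed or its limit is small, then perform the linear scan the comment warns about. This is a sketch under assumed types; `scrape_scan`, the `u64` event stand-in, and the `want` predicate are hypothetical, not chorus's real store API.

```rust
// Sketch of the scraper branch: serve the filter only when scraping is
// allowed or the filter's limit is small (<= 10), then do the
// inefficient linear scan over EVERY event, capped at `limit`.
fn scrape_scan(
    events: &[u64],
    allow_scraping: bool,
    limit: usize,
    want: impl Fn(&u64) -> bool,
) -> Vec<u64> {
    if !(allow_scraping || limit <= 10) {
        // Broad filter and scraping disallowed: reject, return nothing.
        return Vec::new();
    }
    // Full scan over every event, keeping matches up to the limit.
    events.iter().copied().filter(|e| want(e)).take(limit).collect()
}
```

Capping with `take(limit)` is what makes the small-limit exception tolerable: even though the scan touches every event, the response stays bounded.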