Latest commit: Drew DeVault `8941c46191` — Use ts_rank_cd rather than `<=>` (2022-07-13 11:27:55 +02:00)

> We may want to evaluate this more later but for now I need to reduce the number of independent variables while testing indexing changes
| Name | Last commit | Date |
| --- | --- | --- |
| cmd | sh-admin: new command | 2022-07-13 10:20:57 +02:00 |
| config | Fix searchut typo in the config file path | 2022-07-11 13:17:16 +02:00 |
| crawler | crawler: trim excerpt | 2022-07-13 10:26:22 +02:00 |
| database | database: add middleware | 2022-07-09 13:52:55 +02:00 |
| graph | Use ts_rank_cd rather than `<=>` | 2022-07-13 11:27:55 +02:00 |
| import | import/*: fix page_size issues | 2022-07-13 10:24:26 +02:00 |
| query | web: add total pages indexed to home page | 2022-07-11 20:40:53 +02:00 |
| static | web: match alert's dark theme colors with sr.ht | 2022-07-13 10:14:35 +02:00 |
| templates | web: add total pages indexed to home page | 2022-07-11 20:40:53 +02:00 |
| .gitignore | .gitignore: add sh-admin | 2022-07-13 11:27:55 +02:00 |
| config.example.ini | sh-api: expand top-level server riggings | 2022-07-09 15:39:04 +02:00 |
| COPYING | Initial commit | 2022-07-08 19:46:11 +02:00 |
| go.mod | crawler: trim excerpt | 2022-07-13 10:26:22 +02:00 |
| go.sum | crawler: trim excerpt | 2022-07-13 10:26:22 +02:00 |
| gqlgen.yml | API: Implement search resolver | 2022-07-09 15:48:03 +02:00 |
| Makefile | sh-admin: new command | 2022-07-13 10:20:57 +02:00 |
| README.md | Add README.md | 2022-07-08 20:55:55 +02:00 |
| schema.sql | schema: add default for domain tags | 2022-07-13 10:20:27 +02:00 |

WIP

# Why is this crawling my site?

This crawler is still under development. It respects the robots.txt `Disallow` and `Crawl-Delay` directives, but if it's annoying you, email sir@cmpwn.com and I'll knock it off.
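Since the crawler honors robots.txt, site operators can slow it down or exclude it themselves. A minimal sketch of a `robots.txt`, served at the site root, is below; it uses the wildcard `*` group because this README does not state the crawler's specific user-agent token, so any token shown here would be an assumption:

```
# Applies to all compliant crawlers (the searchhut crawler's
# specific user-agent token is not documented here)
User-agent: *
# Wait at least 10 seconds between requests
Crawl-delay: 10
# Do not crawl anything under /private/
Disallow: /private/
```

An empty `Disallow:` line would instead permit crawling of the whole site, and `Disallow: /` would exclude it entirely.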