Merge branch 'master' into feature/32-suggestions
26 CHANGELOG.md
@@ -1,11 +1,33 @@
# Omnisearch Changelog

## 1.2.0, 1.2.1
## 1.3.x

### New

* Chinese support by @aidenlx in https://github.com/scambier/obsidian-omnisearch/pull/37
  * You need to install https://github.com/aidenlx/cm-chs-patch to enable this feature
* Settings page https://github.com/scambier/obsidian-omnisearch/issues/41
* Do not show indexing Notice by default by @chrisgrieser in https://github.com/scambier/obsidian-omnisearch/pull/46
* Include notes that don't exist https://github.com/scambier/obsidian-omnisearch/issues/14

### Improved

* Better accessibility https://github.com/scambier/obsidian-omnisearch/issues/50
* Note aliases are now scored as high as the filename in search results https://github.com/scambier/obsidian-omnisearch/issues/34
* By default, reindexing is now done when the app is out of focus, and not after each save https://github.com/scambier/obsidian-omnisearch/issues/57
  * On mobile, indexing is only done at startup

### Fixed

* Showing an error when a note can't be created https://github.com/scambier/obsidian-omnisearch/issues/52

## 1.2.x

### New
* #42 Files that are present in Obsidian's "Excluded Files" list are downranked by a factor of 3 (_desktop only_)

## 1.2.1
## 1.1.1

### Fixes
* Fixed a crash when no results were returned
@@ -13,6 +13,12 @@ Please read this document before beginning work on a Pull Request.
- Omnisearch is still in its infancy: some important features are missing, and there will be architectural changes.
- As such, I may refuse your PR simply because it will have to be refactored in the short-ish term

## "Good First Issue"

Are you a beginner, looking for a small open source contribution? Look at the "[good first issues](https://github.com/scambier/obsidian-omnisearch/labels/good%20first%20issue)". Those issues have a limited scope, don't require intricate knowledge of the code, and are easy enough to locate, fix, and test.

If you wish to work on one of these issues, leave a comment and I'll assign it to you and give you some pointers.

## Code guidelines

- Respect the existing style

@@ -38,4 +44,5 @@ Always respect those UI & UX points:

## Style guidelines

(todo)
- .ts files must be formatted with "Prettier ESLint"
- .svelte files must be formatted with "Svelte for VS Code"
18 README.md
@@ -1,9 +1,11 @@
# Omnisearch for Obsidian

[](https://github.com/sponsors/scambier)

[](https://gist.github.com/cheerfulstoic/d107229326a01ff0f333a1d3476e068d)

**Omnisearch** is a search engine that "_just works_". Type what you're looking for, and it will instantly show you the most relevant results.
**Omnisearch** is a search engine that "_just works_". It always instantly shows you the most relevant results, thanks to its smart weighting algorithm.

Under the hood, it uses the excellent [MiniSearch](https://github.com/lucaong/minisearch) library.

@@ -11,10 +13,10 @@ Under the hood, it uses the excellent [MiniSearch](https://github.com/lucaong/mi

## Features

- Keyboard-centric, you never have to use your mouse
- Automatic document scoring using the [BM25 algorithm](https://github.com/lucaong/minisearch/issues/129#issuecomment-1046257399)
  - The relevance of a document against a query depends on the number of times the query terms appear in the document, its filename, and its headings
- Instant search results, with highlighting
- Keyboard first: you never have to use your mouse
- Instant & highlighted search results
- Resistance to typos
- In-file search to quickly skim multiple results in a single note
- Search filters: expressions in quotes and exclusions

@@ -22,8 +24,8 @@ Under the hood, it uses the excellent [MiniSearch](https://github.com/lucaong/mi

## Installation

- Omnisearch is available on [the official Community Plugins repository](https://obsidian.md/plugins?search=omnisearch#).
- You can also install it through [BRAT](https://github.com/TfTHacker/obsidian42-brat) for pre-releases. Be advised that those versions can be buggy.
- Omnisearch is available on [the official Community Plugins repository](https://obsidian.md/plugins?search=Omnisearch).
- Beta releases can be installed through [BRAT](https://github.com/TfTHacker/obsidian42-brat). **Be advised that those versions can be buggy.**

You can check the [CHANGELOG](./CHANGELOG.md) for more information on the different versions.

@@ -33,13 +35,13 @@ Omnisearch can be used within 2 different contexts:

### Vault Search

Omnisearch's core feature, accessible with the Command Palette "_Omnisearch: Vault search_". This modal searches through your vault and returns the most relevant notes first. The notes that contain the query terms in their filename or headings are weighted higher than the others.
Omnisearch's core feature, accessible with the Command Palette "**_Omnisearch: Vault search_**". This modal searches through your vault and returns the most relevant notes. That's all you need to _find_ a note.

If you need to list all the matches of a single note, you can do so by using `alt+enter` to open the In-File Search.
If you want to list all the search matches of a single note, you can do so by using `alt+enter` to open the In-File Search.

### In-File Search

Also accessible through the command palette "_Omnisearch: In-file search_". This modal searches through the active note's content and lists the results.
Also accessible through the Command Palette "**_Omnisearch: In-file search_**". This modal searches through the active note's content and lists the matching results. Just press enter to automatically scroll to the right place.

## Customization
||||
@@ -1,7 +1,7 @@
{
  "id": "omnisearch",
  "name": "Omnisearch",
  "version": "1.2.1",
  "version": "1.3.5-beta3",
  "minAppVersion": "0.14.2",
  "description": "A search engine that just works",
  "author": "Simon Cambier",

@@ -1,7 +1,7 @@
{
  "id": "omnisearch",
  "name": "Omnisearch",
  "version": "1.2.1",
  "version": "1.3.4",
  "minAppVersion": "0.14.2",
  "description": "A search engine that just works",
  "author": "Simon Cambier",
16 package.json
@@ -1,6 +1,6 @@
{
  "name": "scambier.obsidian-search",
  "version": "1.2.1",
  "version": "1.3.5-beta3",
  "description": "A search engine for Obsidian",
  "main": "dist/main.js",
  "scripts": {
@@ -15,14 +15,14 @@
  "author": "Simon Cambier",
  "license": "GPL-3",
  "devDependencies": {
    "@babel/preset-env": "^7.16.11",
    "@babel/preset-env": "^7.17.10",
    "@babel/preset-typescript": "^7.16.7",
    "@testing-library/jest-dom": "^5.16.4",
    "@tsconfig/svelte": "^3.0.0",
    "@types/jest": "^27.4.1",
    "@types/node": "^16.11.27",
    "@typescript-eslint/eslint-plugin": "^5.20.0",
    "@typescript-eslint/parser": "^5.20.0",
    "@types/jest": "^27.5.0",
    "@types/node": "^16.11.34",
    "@typescript-eslint/eslint-plugin": "^5.23.0",
    "@typescript-eslint/parser": "^5.23.0",
    "babel-jest": "^27.5.1",
    "builtin-modules": "^3.2.0",
    "esbuild": "0.13.12",
@@ -38,11 +38,11 @@
    "obsidian": "latest",
    "prettier": "^2.6.2",
    "prettier-eslint": "^13.0.0",
    "svelte": "^3.47.0",
    "svelte": "^3.48.0",
    "svelte-jester": "^2.3.2",
    "svelte-preprocess": "^4.10.6",
    "tslib": "2.3.1",
    "typescript": "^4.6.3"
    "typescript": "^4.6.4"
  },
  "dependencies": {
    "minisearch": "^5.0.0-beta1"
633 pnpm-lock.yaml (generated)
File diff suppressed because it is too large
@@ -40,6 +40,14 @@ describe('The Query class', () => {
    ).toBeTruthy()
  })

  it('should not exclude words when there is no space before', () => {
    // Act
    const query = new Query('foo bar-baz')

    // Assert
    expect(query.exclusions).toHaveLength(0)
  })
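The no-space rule tested above falls out of tokenization: a leading `-` only marks an exclusion when it starts a whitespace-delimited token, so the `-` inside `bar-baz` is just part of the word. A minimal, hypothetical sketch of that rule (the real parsing happens in `src/query.ts` via `parseQuery`, which also handles quotes and escapes):

```typescript
// Hypothetical sketch of the exclusion rule: split on whitespace, then treat
// tokens that *start* with '-' as exclusions. A '-' inside a word is kept.
function splitExclusions(query: string): { terms: string[]; exclusions: string[] } {
  const terms: string[] = []
  const exclusions: string[] = []
  for (const token of query.split(/\s+/).filter(t => t.length > 0)) {
    if (token.startsWith('-') && token.length > 1) exclusions.push(token.slice(1))
    else terms.push(token)
  }
  return { terms, exclusions }
}

console.log(splitExclusions('foo bar-baz')) // exclusions: []
console.log(splitExclusions('foo -bar'))    // exclusions: ['bar']
```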
  describe('.getExactTerms()', () => {
    it('should return an array of strings containing "exact" values', () => {
      // Act
49 src/__tests__/utils-tests.ts Normal file
@@ -0,0 +1,49 @@
import type { CachedMetadata } from 'obsidian'
import { getAliasesFromMetadata } from '../utils'

describe('Utils', () => {
  describe('getAliasesFromMetadata', () => {
    it('should return an empty array if no metadata is provided', () => {
      // Act
      const actual = getAliasesFromMetadata(null)
      // Assert
      expect(actual).toEqual([])
    })
    it('should return an empty array if no aliases are provided', () => {
      // Act
      const actual = getAliasesFromMetadata({})
      // Assert
      expect(actual).toEqual([])
    })
    it('should return the aliases array as-is', () => {
      // Arrange
      const metadata = {
        frontmatter: { aliases: ['foo', 'bar'] },
      } as CachedMetadata
      // Act
      const actual = getAliasesFromMetadata(metadata)
      // Assert
      expect(actual).toEqual(['foo', 'bar'])
    })
    it('should convert the aliases string into an array', () => {
      // Arrange
      const metadata = {
        frontmatter: { aliases: 'foo, bar' },
      } as CachedMetadata
      // Act
      const actual = getAliasesFromMetadata(metadata)
      // Assert
      expect(actual).toEqual(['foo', 'bar'])
    })
    it('should return an empty array if the aliases field is an empty string', () => {
      // Arrange
      const metadata = {
        frontmatter: { aliases: '' },
      } as CachedMetadata
      // Act
      const actual = getAliasesFromMetadata(metadata)
      // Assert
      expect(actual).toEqual([])
    })
  })
})
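The tests above pin down the expected behavior of `getAliasesFromMetadata`. A minimal sketch that satisfies them might look like this (the real implementation lives in `src/utils.ts`; the `Metadata` shape here is a simplified stand-in for Obsidian's `CachedMetadata`):

```typescript
// Simplified stand-in for Obsidian's CachedMetadata frontmatter shape.
type Metadata = { frontmatter?: { aliases?: string | string[] } } | null

// Normalize the frontmatter `aliases` field into a string array:
// - null metadata, missing field, or empty string -> []
// - an array is returned as-is
// - a comma-separated string is split into trimmed values
function getAliasesFromMetadata(metadata: Metadata): string[] {
  const aliases = metadata?.frontmatter?.aliases
  if (!aliases) return []
  if (Array.isArray(aliases)) return aliases
  return aliases
    .split(',')
    .map(s => s.trim())
    .filter(s => s.length > 0)
}

console.log(getAliasesFromMetadata({ frontmatter: { aliases: 'foo, bar' } })) // → ['foo', 'bar']
```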
2 src/components/GlyphAddNote.svelte Normal file
@@ -0,0 +1,2 @@
<script lang="ts"></script>
<span class="suggestion-flair" aria-label="Not created yet, select to create"><svg viewBox="0 0 100 100" class="add-note-glyph" width="16" height="16"><path fill="currentColor" stroke="currentColor" d="M23.3,6.7c-3.7,0-6.7,3-6.7,6.7v73.3c0,3.7,3,6.7,6.7,6.7h28.4c-3.2-4.8-5.1-10.5-5.1-16.7c0-16.6,13.4-30,30-30 c2.3,0,4.5,0.3,6.7,0.8V31.7c0-0.9-0.3-1.7-1-2.4L60.7,7.6c-0.6-0.6-1.5-1-2.4-1L23.3,6.7z M56.7,13L77,33.3H60 c-1.8,0-3.3-1.5-3.3-3.3L56.7,13z M76.7,53.3c-12.9,0-23.3,10.4-23.3,23.3S63.8,100,76.7,100S100,89.6,100,76.7 S89.6,53.3,76.7,53.3z M76.7,63.3c1.8,0,3.3,1.5,3.3,3.3v6.7h6.7c1.8,0,3.3,1.5,3.3,3.3c0,1.8-1.5,3.3-3.3,3.3H80v6.7 c0,1.8-1.5,3.3-3.3,3.3c-1.8,0-3.3-1.5-3.3-3.3V80h-6.7c-1.8,0-3.3-1.5-3.3-3.3s1.5-3.3,3.3-3.3h6.7v-6.7 C73.3,64.8,74.8,63.3,76.7,63.3L76.7,63.3z"></path></svg></span>
||||
@@ -1,5 +1,6 @@
<script lang="ts">
import { debounce } from "obsidian"
import { toggleInputComposition } from "src/globals"
import { createEventDispatcher, onMount, tick } from "svelte"

export let value = ""
@@ -22,6 +23,8 @@ const debouncedOnInput = debounce(() => {
  bind:value
  bind:this={elInput}
  on:input={debouncedOnInput}
  on:compositionstart={(_) => toggleInputComposition(true)}
  on:compositionend={(_) => toggleInputComposition(false)}
  type="text"
  class="prompt-input"
  placeholder="Type to search through your notes"
||||
@@ -71,11 +71,13 @@ $: {
function getGroups(matches: SearchMatch[]): SearchMatch[][] {
  const groups: SearchMatch[][] = []
  let lastOffset = -1
  let count = 0 // TODO: FIXME: this is a hack to avoid infinite loops
  while (true) {
    const group = getGroupedMatches(matches, lastOffset, excerptAfter)
    if (!group.length) break
    lastOffset = group.last()!.offset
    groups.push(group)
    if (++count > 100) break
  }
  return groups
}
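`getGroupedMatches` itself is not shown in this hunk. For context, a plausible, purely hypothetical sketch of offset-window grouping that would make the loop above terminate (assuming matches are sorted by offset: each call returns the next run of matches within `window` characters of the first unconsumed one, and an empty array when exhausted):

```typescript
type SearchMatch = { match: string; offset: number }

// Hypothetical sketch: collect matches whose offset falls within `window`
// characters of the first match located after `lastOffset`.
function getGroupedMatches(
  matches: SearchMatch[],
  lastOffset: number,
  window: number,
): SearchMatch[] {
  const remaining = matches.filter(m => m.offset > lastOffset)
  if (!remaining.length) return []
  const start = remaining[0].offset
  return remaining.filter(m => m.offset <= start + window)
}

// Offsets 0 and 10 fall in one 20-char window; 50 starts the next group.
console.log(getGroupedMatches(
  [{ match: 'a', offset: 0 }, { match: 'b', offset: 10 }, { match: 'c', offset: 50 }],
  -1,
  20,
))
```

Because `lastOffset` advances to the last offset of each group, every iteration either consumes matches or returns an empty group, which is why the `count > 100` guard is described as a belt-and-braces hack rather than a necessity.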
@@ -3,13 +3,13 @@ let lastSearch = ""
</script>

<script lang="ts">
import { TFile } from "obsidian"
import { Notice, TFile } from "obsidian"
import { onMount, tick } from "svelte"
import InputSearch from "./InputSearch.svelte"
import ModalContainer from "./ModalContainer.svelte"
import { eventBus, type ResultNote } from "src/globals"
import { createNote, openNote } from "src/notes"
import { getSuggestions } from "src/search"
import { getSuggestions, reindexNotes } from "src/search"
import { loopIndex } from "src/utils"
import { OmnisearchInFileModal, type OmnisearchVaultModal } from "src/modals"
import ResultItemVault from "./ResultItemVault.svelte"
@@ -29,6 +29,7 @@ $: if (searchQuery) {
}

onMount(() => {
  reindexNotes()
  searchQuery = lastSearch
  eventBus.on("vault", "enter", onInputEnter)
  eventBus.on("vault", "shift-enter", onInputShiftEnter)
@@ -67,7 +68,13 @@ function onInputCtrlEnter(): void {
}

async function onInputShiftEnter(): Promise<void> {
  await createNote(searchQuery)
  try {
    await createNote(searchQuery)
  }
  catch (e) {
    new Notice((e as Error).message)
    return
  }
  modal.close()
}
||||
@@ -1,8 +1,9 @@
<script lang="ts">
import { createEventDispatcher } from "svelte"
import GlyphAddNote from "./GlyphAddNote.svelte"

export let id: string
export let selected = false
export let glyph = false
</script>

<div
@@ -13,5 +14,8 @@ export let selected = false
  on:click
  on:auxclick
>
  {#if glyph}
    <GlyphAddNote />
  {/if}
  <slot />
</div>
||||
@@ -1,4 +1,6 @@
<script lang="ts">
import { getNoteFromCache } from "src/notes"
import { settings } from "src/settings"
import type { ResultNote } from "../globals"
import { getMatches } from "../search"
import { highlighter, makeExcerpt, stringsToRegex } from "../utils"
@@ -10,16 +12,20 @@ export let note: ResultNote
$: reg = stringsToRegex(note.foundWords)
$: matches = getMatches(note.content, reg)
$: cleanedContent = makeExcerpt(note.content, note.matches[0]?.offset ?? -1)
$: glyph = getNoteFromCache(note.path)?.doesNotExist
$: title = settings.showShortName ? note.basename : note.path
</script>

<ResultItemContainer id={note.path} {selected} on:mousemove on:click>
<ResultItemContainer id={note.path} {selected} on:mousemove on:click {glyph}>
  <span class="omnisearch-result__title">
    {@html note.basename.replace(reg, highlighter)}
    {@html title.replace(reg, highlighter)}
  </span>

  <span class="omnisearch-result__counter">
    {matches.length} {matches.length > 1 ? "matches" : "match"}
  </span>
  {#if matches.length > 0}
    <span class="omnisearch-result__counter">
      {matches.length} {matches.length > 1 ? "matches" : "match"}
    </span>
  {/if}
  <div class="omnisearch-result__body">
    {@html cleanedContent.replace(reg, highlighter)}
  </div>
||||
@@ -23,10 +23,17 @@ export type SearchNote = {
export type IndexedNote = {
  path: string
  basename: string
  mtime: number

  content: string
  aliases: string
  tags: string[]
  headings1: string
  headings2: string
  headings3: string

  doesNotExist?: boolean
  parent?: string
}

export type SearchMatch = {
@@ -46,5 +53,13 @@ export type ResultNote = {
  matches: SearchMatch[]
}

let inComposition = false
export function toggleInputComposition(toggle: boolean): void {
  inComposition = toggle
}
export function isInputComposition(): boolean {
  return inComposition
}

export const SPACE_OR_PUNCTUATION =
  /[|\n\r -#%-*,-/:;?@[-\]_{}\u00A0\u00A1\u00A7\u00AB\u00B6\u00B7\u00BB\u00BF\u037E\u0387\u055A-\u055F\u0589\u058A\u05BE\u05C0\u05C3\u05C6\u05F3\u05F4\u0609\u060A\u060C\u060D\u061B\u061E\u061F\u066A-\u066D\u06D4\u0700-\u070D\u07F7-\u07F9\u0830-\u083E\u085E\u0964\u0965\u0970\u09FD\u0A76\u0AF0\u0C77\u0C84\u0DF4\u0E4F\u0E5A\u0E5B\u0F04-\u0F12\u0F14\u0F3A-\u0F3D\u0F85\u0FD0-\u0FD4\u0FD9\u0FDA\u104A-\u104F\u10FB\u1360-\u1368\u1400\u166E\u1680\u169B\u169C\u16EB-\u16ED\u1735\u1736\u17D4-\u17D6\u17D8-\u17DA\u1800-\u180A\u1944\u1945\u1A1E\u1A1F\u1AA0-\u1AA6\u1AA8-\u1AAD\u1B5A-\u1B60\u1BFC-\u1BFF\u1C3B-\u1C3F\u1C7E\u1C7F\u1CC0-\u1CC7\u1CD3\u2000-\u200A\u2010-\u2029\u202F-\u2043\u2045-\u2051\u2053-\u205F\u207D\u207E\u208D\u208E\u2308-\u230B\u2329\u232A\u2768-\u2775\u27C5\u27C6\u27E6-\u27EF\u2983-\u2998\u29D8-\u29DB\u29FC\u29FD\u2CF9-\u2CFC\u2CFE\u2CFF\u2D70\u2E00-\u2E2E\u2E30-\u2E4F\u3000-\u3003\u3008-\u3011\u3014-\u301F\u3030\u303D\u30A0\u30FB\uA4FE\uA4FF\uA60D-\uA60F\uA673\uA67E\uA6F2-\uA6F7\uA874-\uA877\uA8CE\uA8CF\uA8F8-\uA8FA\uA8FC\uA92E\uA92F\uA95F\uA9C1-\uA9CD\uA9DE\uA9DF\uAA5C-\uAA5F\uAADE\uAADF\uAAF0\uAAF1\uABEB\uFD3E\uFD3F\uFE10-\uFE19\uFE30-\uFE52\uFE54-\uFE61\uFE63\uFE68\uFE6A\uFE6B\uFF01-\uFF03\uFF05-\uFF0A\uFF0C-\uFF0F\uFF1A\uFF1B\uFF1F\uFF20\uFF3B-\uFF3D\uFF3F\uFF5B\uFF5D\uFF5F-\uFF65]+/u
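Conceptually, `SPACE_OR_PUNCTUATION` is a tokenizer separator: it splits text on any run of whitespace or punctuation, including CJK and other full-width Unicode punctuation. A simplified, ASCII-only stand-in illustrates the idea (this is not the real constant, which covers many more Unicode ranges):

```typescript
// Simplified stand-in for SPACE_OR_PUNCTUATION: split on runs of whitespace
// or common ASCII punctuation. The real regex also covers CJK punctuation
// and many other Unicode blocks.
const SPACE_OR_PUNCT_ASCII = /[\s|,.;:!?'"()\[\]{}<>\/\\_-]+/u

function tokenize(text: string): string[] {
  return text.split(SPACE_OR_PUNCT_ASCII).filter(t => t.length > 0)
}

console.log(tokenize('foo, bar|baz - qux')) // → ['foo', 'bar', 'baz', 'qux']
```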
19 src/main.ts
@@ -1,14 +1,22 @@
import { Plugin, TFile } from 'obsidian'
import {
  addNoteToReindex,
  addToIndex,
  initGlobalSearchIndex,
  removeFromIndex,
  removeFromIndexByPath,
} from './search'
import { OmnisearchInFileModal, OmnisearchVaultModal } from './modals'
import { loadSettings, SettingsTab } from './settings'
import { OmnisearchSuggest } from './suggestions'

// let mainWindow: { on: any; off: any } | null = null
// try {
//   mainWindow = require('electron').remote.getCurrentWindow()
// }
// catch (e) {
//   console.log("Can't load electron, mobile platform")
// }

export default class OmnisearchPlugin extends Plugin {
  async onload(): Promise<void> {
    await loadSettings(this)
@@ -41,19 +49,18 @@ export default class OmnisearchPlugin extends Plugin {
    )
    this.registerEvent(
      this.app.vault.on('delete', file => {
        removeFromIndex(file)
        removeFromIndex(file.path)
      }),
    )
    this.registerEvent(
      this.app.vault.on('modify', async file => {
        removeFromIndex(file)
        await addToIndex(file)
        addNoteToReindex(file)
      }),
    )
    this.registerEvent(
      this.app.vault.on('rename', async (file, oldPath) => {
        if (file instanceof TFile && file.path.endsWith('.md')) {
          removeFromIndexByPath(oldPath)
          removeFromIndex(oldPath)
          await addToIndex(file)
        }
      }),
@@ -62,4 +69,6 @@ export default class OmnisearchPlugin extends Plugin {
      await initGlobalSearchIndex()
    })
  }

  onunload(): void {}
}
||||
@@ -1,7 +1,8 @@
import { App, Modal, TFile } from 'obsidian'
import ModalVault from './components/ModalVault.svelte'
import ModalInFile from './components/ModalInFile.svelte'
import { eventBus } from './globals'
import { eventBus, isInputComposition } from './globals'
import { settings } from './settings'

abstract class OmnisearchModal extends Modal {
  constructor(app: App) {
@@ -17,6 +18,9 @@ abstract class OmnisearchModal extends Modal {
    this.modalEl.tabIndex = -1

    // Setup events that can be listened through the event bus

    // #region Up/Down navigation

    this.scope.register([], 'ArrowDown', e => {
      e.preventDefault()
      eventBus.emit('arrow-down')
@@ -25,6 +29,39 @@ abstract class OmnisearchModal extends Modal {
      e.preventDefault()
      eventBus.emit('arrow-up')
    })

    // Ctrl+j/k
    for (const key of [
      { k: 'j', dir: 'down' },
      { k: 'k', dir: 'up' },
    ] as const) {
      for (const modifier of ['Ctrl', 'Meta'] as const) {
        this.scope.register([modifier], key.k, e => {
          if (settings.CtrlJK && this.app.vault.getConfig('vimMode')) {
            e.preventDefault()
            eventBus.emit('arrow-' + key.dir)
          }
        })
      }
    }

    // Ctrl+n/p
    for (const key of [
      { k: 'n', dir: 'down' },
      { k: 'p', dir: 'up' },
    ] as const) {
      for (const modifier of ['Ctrl', 'Meta'] as const) {
        this.scope.register([modifier], key.k, e => {
          if (settings.CtrlNP && this.app.vault.getConfig('vimMode')) {
            e.preventDefault()
            eventBus.emit('arrow-' + key.dir)
          }
        })
      }
    }

    // #endregion Up/Down navigation

    this.scope.register(['Ctrl'], 'Enter', e => {
      e.preventDefault()
      eventBus.emit('ctrl-enter') // Open in new pane
@@ -33,17 +70,23 @@ abstract class OmnisearchModal extends Modal {
      e.preventDefault()
      eventBus.emit('ctrl-enter') // Open in new pane (but on Mac)
    })

    this.scope.register(['Alt'], 'Enter', e => {
      e.preventDefault()
      eventBus.emit('alt-enter') // Open the InFile modal
    })

    this.scope.register(['Shift'], 'Enter', e => {
      e.preventDefault()
      eventBus.emit('shift-enter') // Create a new note
    })

    this.scope.register([], 'Enter', e => {
      e.preventDefault()
      eventBus.emit('enter') // Open in current pane
      if (!isInputComposition()) {
        // Check if the user is still typing
        e.preventDefault()
        eventBus.emit('enter') // Open in current pane
      }
    })
  }
}
||||
118 src/notes.ts
@@ -1,6 +1,58 @@
import { MarkdownView } from 'obsidian'
import type { ResultNote } from './globals'
import {
  MarkdownView,
  TFile,
  WorkspaceLeaf,
  type CachedMetadata,
} from 'obsidian'
import type { IndexedNote, ResultNote } from './globals'
import { stringsToRegex } from './utils'
import { settings } from './settings'

/**
 * This is an in-memory cache of the notes, with all their computed fields
 * used by the search engine.
 * This cache allows us to quickly de-index notes when they are deleted or updated.
 */
export let notesCache: Record<string, IndexedNote> = {}

const notesCacheFilePath = `${app.vault.configDir}/plugins/omnisearch/notesCache.json`

export function resetNotesCache(): void {
  notesCache = {}
}

export async function loadNotesCache(): Promise<void> {
  if (
    settings.storeIndexInFile &&
    (await app.vault.adapter.exists(notesCacheFilePath))
  ) {
    try {
      const json = await app.vault.adapter.read(notesCacheFilePath)
      notesCache = JSON.parse(json)
      console.log('Notes cache loaded from the file')
    }
    catch (e) {
      console.trace('Could not load Notes cache from the file')
      console.error(e)
    }
  }

  if (!notesCache) {
    notesCache = {}
  }
}
export function getNoteFromCache(key: string): IndexedNote | undefined {
  return notesCache[key]
}
export function getNonExistingNotesFromCache(): IndexedNote[] {
  return Object.values(notesCache).filter(note => note.doesNotExist)
}
export function addNoteToCache(filename: string, note: IndexedNote): void {
  notesCache[filename] = note
}
export function removeNoteFromCache(key: string): void {
  delete notesCache[key]
}

export async function openNote(
  item: ResultNote,
@@ -9,7 +61,26 @@ export async function openNote(
  const reg = stringsToRegex(item.foundWords)
  reg.exec(item.content)
  const offset = reg.lastIndex
  await app.workspace.openLinkText(item.path, '', newPane)

  // Check if the note is already open
  // const pane = MarkdownView.getPane(item.path)

  // Check if the note is already open,
  // to avoid opening it twice if the first one is pinned
  let existing = false
  app.workspace.iterateAllLeaves(leaf => {
    if (leaf.view instanceof MarkdownView) {
      if (leaf.getViewState().state?.file === item.path) {
        app.workspace.setActiveLeaf(leaf, false, true)
        existing = true
      }
    }
  })

  if (!existing) {
    // Open a new note
    await app.workspace.openLinkText(item.path, '', newPane)
  }

  const view = app.workspace.getActiveViewOfType(MarkdownView)
  if (!view) {
@@ -38,5 +109,46 @@ export async function createNote(name: string): Promise<void> {
  }
  catch (e) {
    console.error(e)
    throw e
  }
}

/**
 * For a given file, returns a list of links leading to notes that don't exist
 * @param file
 * @param metadata
 * @returns
 */
export function getNonExistingNotes(
  file: TFile,
  metadata: CachedMetadata,
): string[] {
  return (metadata.links ?? [])
    .map(l => {
      const path = removeAnchors(l.link)
      return app.metadataCache.getFirstLinkpathDest(path, file.path)
        ? ''
        : l.link
    })
    .filter(l => !!l)
}

/**
 * Removes anchors and headings
 * @param name
 * @returns
 */
export function removeAnchors(name: string): string {
  return name.split(/[\^#]+/)[0]
}
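`removeAnchors` works by splitting at the first `#` (heading anchor) or `^` (block anchor) and keeping the leading part. A quick self-contained illustration (the one-liner is reproduced here so the example runs on its own):

```typescript
// Strips heading (#) and block (^) anchors from a link target,
// mirroring the removeAnchors helper above.
function removeAnchors(name: string): string {
  return name.split(/[\^#]+/)[0]
}

console.log(removeAnchors('My Note#Heading'))  // → 'My Note'
console.log(removeAnchors('My Note^blockid'))  // → 'My Note'
console.log(removeAnchors('My Note'))          // → 'My Note'
```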
export async function saveNotesCacheToFile(): Promise<void> {
  const json = JSON.stringify(notesCache)
  await app.vault.adapter.write(notesCacheFilePath, json)
  console.log('Notes cache saved to the file')
}

export function isCacheOutdated(file: TFile): boolean {
  const indexedNote = getNoteFromCache(file.path)
  return !indexedNote || indexedNote.mtime !== file.stat.mtime
}
||||
336 src/query.ts
@@ -1,4 +1,6 @@
import { stripSurroundingQuotes } from './utils'
import { settings } from './settings'
import { removeDiacritics, stripSurroundingQuotes } from './utils'
import { parseQuery } from './vendor/parse-query'

type QueryToken = {
  /**
@@ -20,6 +22,7 @@ export class Query {
  public exclusions: QueryToken[] = []

  constructor(text = '') {
    if (settings.ignoreDiacritics) text = removeDiacritics(text)
    const tokens = parseQuery(text.toLowerCase(), { tokenize: true })
    this.exclusions = tokens.exclude.text
      .map(this.formatToken)
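`removeDiacritics` is imported from `./utils` but its body is not part of this diff. A common implementation, sketched here as an assumption about its behavior, uses Unicode NFD normalization to separate base letters from combining marks and then drops the marks, so e.g. "résumé" matches "resume":

```typescript
// Hypothetical sketch of removeDiacritics: decompose to NFD, then strip
// the combining diacritical marks, leaving the base letters.
function removeDiacritics(str: string): string {
  return str.normalize('NFD').replace(/\p{Diacritic}/gu, '')
}

console.log(removeDiacritics('résumé')) // → 'resume'
console.log(removeDiacritics('café'))   // → 'cafe'
```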
@@ -47,334 +50,3 @@ export class Query {
    }
  }
}

/*!
 * search-query-parser.js
 * Original: https://github.com/nepsilon/search-query-parser
 * Modified by Simon Cambier
 * Copyright(c) 2014-2019
 * MIT Licensed
 */

interface SearchParserOptions {
  offsets?: boolean
  tokenize: true
  keywords?: string[]
  ranges?: string[]
  alwaysArray?: boolean
}

interface ISearchParserDictionary {
  [key: string]: any
}

type SearchParserKeyWordOffset = {
  keyword: string
  value?: string
}

type SearchParserTextOffset = {
  text: string
}

type SearchParserOffset = (
  | SearchParserKeyWordOffset
  | SearchParserTextOffset
) & {
  offsetStart: number
  offsetEnd: number
}

interface SearchParserResult extends ISearchParserDictionary {
  text: string[]
  offsets: SearchParserOffset[]
  exclude: { text: string[] }
}

function parseQuery(
  string: string,
  options: SearchParserOptions,
): SearchParserResult {
  // Set a default options object when none is provided
  if (!options) {
    options = { offsets: true, tokenize: true }
  }
  else {
    // If options.offsets wasn't passed, set it to true
    options.offsets =
      typeof options.offsets === 'undefined' ? true : options.offsets
  }

  if (!string) {
    string = ''
  }

  // Our object to store the query object
  const query: SearchParserResult = {
    text: [],
    offsets: [],
    exclude: { text: [] },
  }
  // When offsets is true, create their array
  if (options.offsets) {
    query.offsets = []
  }
  const exclusion: ISearchParserDictionary & { text: string[] } = { text: [] }
  const terms = []
  // Get a list of search terms respecting single and double quotes
  const regex =
    /(\S+:'(?:[^'\\]|\\.)*')|(\S+:"(?:[^"\\]|\\.)*")|(-?"(?:[^"\\]|\\.)*")|(-?'(?:[^'\\]|\\.)*')|\S+|\S+:\S+/g
  let match
  while ((match = regex.exec(string)) !== null) {
    let term = match[0]
    const sepIndex = term.indexOf(':')

    // Terms that contain a `:`
    if (sepIndex !== -1) {
      const key = term.slice(0, sepIndex)
      let val = term.slice(sepIndex + 1)

      // Strip backslashes respecting escapes
      val = (val + '').replace(/\\(.?)/g, function (s, n1) {
        switch (n1) {
          case '\\':
            return '\\'
          case '0':
            return '\u0000'
          case '':
            return ''
          default:
            return n1
        }
      })
      terms.push({
        keyword: key,
        value: val,
        offsetStart: match.index,
        offsetEnd: match.index + term.length,
      })
    }

    // Other terms
    else {
      let isExcludedTerm = false
      if (term[0] === '-') {
        isExcludedTerm = true
        term = term.slice(1)
      }

      // Strip backslashes respecting escapes
      term = (term + '').replace(/\\(.?)/g, function (s, n1) {
        switch (n1) {
          case '\\':
            return '\\'
          case '0':
            return '\u0000'
          case '':
            return ''
          default:
            return n1
        }
      })

      if (isExcludedTerm) {
        exclusion.text.push(term)
      }
      else {
        terms.push({
          text: term,
          offsetStart: match.index,
          offsetEnd: match.index + term.length,
        })
      }
    }
  }
  // Reverse to ensure proper order when pop()'ing.
  terms.reverse()
  // For each search term
  let term
  while ((term = terms.pop())) {
    // When just a simple term
    if (term.text) {
      // We add it as pure text
      query.text.push(term.text)
      // When offsets is true, push a new offset
      if (options.offsets) {
        query.offsets.push(term)
      }
    }
    // We got an advanced search syntax
    else if (term.keyword) {
      let key = term.keyword
      // Check if the key is a registered keyword
      options.keywords = options.keywords || []
      let isKeyword = false
      let isExclusion = false
      if (!/^-/.test(key)) {
        isKeyword = !(options.keywords.indexOf(key) === -1)
      }
      else if (key[0] === '-') {
        const _key = key.slice(1)
        isKeyword = !(options.keywords.indexOf(_key) === -1)
        if (isKeyword) {
          key = _key
          isExclusion = true
        }
      }

      // Check if the key is a registered range
      options.ranges = options.ranges || []
      const isRange = !(options.ranges.indexOf(key) === -1)
      // When the key matches a keyword
      if (isKeyword) {
        // When offsets is true, push a new offset
        if (options.offsets) {
          query.offsets.push({
            keyword: key,
            value: term.value,
            offsetStart: isExclusion ? term.offsetStart + 1 : term.offsetStart,
            offsetEnd: term.offsetEnd,
          })
        }

        const value = term.value
        // When value is a thing
        if (value.length) {
          // Get an array of values when several are there
          const values = value.split(',')
          if (isExclusion) {
            if (exclusion[key]) {
              // ...many times...
              if (exclusion[key] instanceof Array) {
                // ...and got several values this time...
                if (values.length > 1) {
                  // ... concatenate both arrays.
                  exclusion[key] = exclusion[key].concat(values)
                }
                else {
                  // ... append the current single value.
                  exclusion[key].push(value)
                }
              }
              // We saw that keyword only once before
              else {
                // Put both the current value and the new
                // value in an array
                exclusion[key] = [exclusion[key]]
                exclusion[key].push(value)
}
|
||||
}
|
||||
// First time we see that keyword
|
||||
else {
|
||||
// ...and got several values this time...
|
||||
if (values.length > 1) {
|
||||
// ...add all values seen.
|
||||
exclusion[key] = values
|
||||
}
|
||||
// Got only a single value this time
|
||||
else {
|
||||
// Record its value as a string
|
||||
if (options.alwaysArray) {
|
||||
// ...but we always return an array if option alwaysArray is true
|
||||
exclusion[key] = [value]
|
||||
}
|
||||
else {
|
||||
// Record its value as a string
|
||||
exclusion[key] = value
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
else {
|
||||
// If we already have seen that keyword...
|
||||
if (query[key]) {
|
||||
// ...many times...
|
||||
if (query[key] instanceof Array) {
|
||||
// ...and got several values this time...
|
||||
if (values.length > 1) {
|
||||
// ... concatenate both arrays.
|
||||
query[key] = query[key].concat(values)
|
||||
}
|
||||
else {
|
||||
// ... append the current single value.
|
||||
query[key].push(value)
|
||||
}
|
||||
}
|
||||
// We saw that keyword only once before
|
||||
else {
|
||||
// Put both the current value and the new
|
||||
// value in an array
|
||||
query[key] = [query[key]]
|
||||
query[key].push(value)
|
||||
}
|
||||
}
|
||||
// First time we see that keyword
|
||||
else {
|
||||
// ...and got several values this time...
|
||||
if (values.length > 1) {
|
||||
// ...add all values seen.
|
||||
query[key] = values
|
||||
}
|
||||
// Got only a single value this time
|
||||
else {
|
||||
if (options.alwaysArray) {
|
||||
// ...but we always return an array if option alwaysArray is true
|
||||
query[key] = [value]
|
||||
}
|
||||
else {
|
||||
// Record its value as a string
|
||||
query[key] = value
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
// The key allows a range
|
||||
else if (isRange) {
|
||||
// When offsets is true, push a new offset
|
||||
if (options.offsets) {
|
||||
query.offsets.push(term)
|
||||
}
|
||||
|
||||
const value = term.value
|
||||
// Range are separated with a dash
|
||||
const rangeValues = value.split('-')
|
||||
// When both end of the range are specified
|
||||
// keyword:XXXX-YYYY
|
||||
query[key] = {}
|
||||
if (rangeValues.length === 2) {
|
||||
query[key].from = rangeValues[0]
|
||||
query[key].to = rangeValues[1]
|
||||
}
|
||||
// When pairs of ranges are specified
|
||||
// keyword:XXXX-YYYY,AAAA-BBBB
|
||||
// else if (!rangeValues.length % 2) {
|
||||
// }
|
||||
// When only getting a single value,
|
||||
// or an odd number of values
|
||||
else {
|
||||
query[key].from = value
|
||||
}
|
||||
}
|
||||
else {
|
||||
// We add it as pure text
|
||||
const text = term.keyword + ':' + term.value
|
||||
query.text.push(text)
|
||||
|
||||
// When offsets is true, push a new offset
|
||||
if (options.offsets) {
|
||||
query.offsets.push({
|
||||
text: text,
|
||||
offsetStart: term.offsetStart,
|
||||
offsetEnd: term.offsetEnd,
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Return forged query object
|
||||
query.exclude = exclusion
|
||||
return query
|
||||
}
|
||||
|
||||
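As a quick illustration of the term-splitting regex above, here is a minimal standalone sketch. The sample query string is invented for this example; it shows how quoted keyword values, exclusions, and plain words are split into terms before the parser classifies them:

```typescript
// Same term-splitting regex as in parse-query.ts above.
const regex =
  /(\S+:'(?:[^'\\]|\\.)*')|(\S+:"(?:[^"\\]|\\.)*")|(-?"(?:[^"\\]|\\.)*")|(-?'(?:[^'\\]|\\.)*')|\S+|\S+:\S+/g

// Hypothetical query: a quoted keyword value, an excluded word, a plain word.
const input = `tag:"open issues" -draft hello`
const terms: string[] = []
let m: RegExpExecArray | null
while ((m = regex.exec(input)) !== null) {
  terms.push(m[0])
}
console.log(terms) // → ['tag:"open issues"', '-draft', 'hello']
```

Note how the quoted value stays attached to its `tag:` keyword as one term, while `-draft` is kept whole so the parser can later strip the leading `-` and route it to the exclusion list.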
263
src/search.ts
@@ -1,5 +1,5 @@
import { Notice, TFile, type TAbstractFile } from 'obsidian'
import MiniSearch, { type SearchResult } from 'minisearch'
import { Notice, TAbstractFile, TFile } from 'obsidian'
import MiniSearch, { type Options, type SearchResult } from 'minisearch'
import {
chsRegex,
SPACE_OR_PUNCTUATION,
@@ -9,15 +9,31 @@ import {
} from './globals'
import {
extractHeadingsFromCache,
getAliasesFromMetadata,
getTagsFromMetadata,
removeDiacritics,
stringsToRegex,
stripMarkdownCharacters,
wait,
} from './utils'
import type { Query } from './query'
import { settings } from './settings'
import {
removeNoteFromCache,
getNoteFromCache,
getNonExistingNotes,
resetNotesCache,
addNoteToCache,
removeAnchors,
getNonExistingNotesFromCache,
loadNotesCache,
saveNotesCacheToFile,
isCacheOutdated,
} from './notes'

let minisearchInstance: MiniSearch<IndexedNote>
let indexedNotes: Record<string, IndexedNote> = {}
let isIndexChanged: boolean
const searchIndexFilePath = `${app.vault.configDir}/plugins/omnisearch/searchIndex.json`

const tokenize = (text: string): string[] => {
const tokens = text.split(SPACE_OR_PUNCTUATION)
@@ -36,38 +52,87 @@ const tokenize = (text: string): string[] => {
 * and adds all the notes to the index
 */
export async function initGlobalSearchIndex(): Promise<void> {
indexedNotes = {}
minisearchInstance = new MiniSearch({
const options: Options<IndexedNote> = {
tokenize,
processTerm: (term: string) =>
(settings.ignoreDiacritics ? removeDiacritics(term) : term).toLowerCase(),
idField: 'path',
fields: ['basename', 'content', 'headings1', 'headings2', 'headings3'],
})
fields: [
'basename',
'aliases',
'content',
'headings1',
'headings2',
'headings3',
],
storeFields: ['tags'],
}

if (
settings.storeIndexInFile &&
(await app.vault.adapter.exists(searchIndexFilePath))
) {
try {
const json = await app.vault.adapter.read(searchIndexFilePath)
minisearchInstance = MiniSearch.loadJSON(json, options)
console.log('MiniSearch index loaded from the file')
await loadNotesCache()
}
catch (e) {
console.trace('Could not load MiniSearch index from the file')
console.error(e)
}
}

if (!minisearchInstance) {
minisearchInstance = new MiniSearch(options)
resetNotesCache()
}

// Index files that are already present
const start = new Date().getTime()
const files = app.vault.getMarkdownFiles()

const allFiles = app.vault.getMarkdownFiles()

let files
let notesSuffix
if (settings.storeIndexInFile) {
files = allFiles.filter(file => isCacheOutdated(file))
notesSuffix = 'modified notes'
}
else {
files = allFiles
notesSuffix = 'notes'
}

console.log(`Omnisearch - indexing ${files.length} ${notesSuffix}`)

// This is basically the same behavior as MiniSearch's `addAllAsync()`.
// We index files in batches of 10
if (files.length) {
console.log('Omnisearch - indexing ' + files.length + ' files')
}
for (let i = 0; i < files.length; ++i) {
if (i % 10 === 0) await wait(0)
const file = files[i]
if (file) await addToIndex(file)
if (file) {
if (getNoteFromCache(file.path)) {
removeFromIndex(file.path)
}
await addToIndex(file)
}
}

if (files.length > 0 && settings.showIndexingNotices) {
new Notice(
`Omnisearch - Indexed ${files.length} notes in ${
new Date().getTime() - start
}ms`,
)
}
if (files.length > 0) {
const message = `Omnisearch - Indexed ${files.length} ${notesSuffix} in ${
new Date().getTime() - start
}ms`

// Listen to the query input to trigger a search
// subscribeToQuery()
console.log(message)

if (settings.showIndexingNotices) {
new Notice(message)
}

await saveIndexToFile()
}
}

/**
@@ -84,6 +149,7 @@ async function search(query: Query): Promise<SearchResult[]> {
combineWith: 'AND',
boost: {
basename: settings.weightBasename,
aliases: settings.weightBasename,
headings1: settings.weightH1,
headings2: settings.weightH2,
headings3: settings.weightH3,
@@ -106,19 +172,20 @@ async function search(query: Query): Promise<SearchResult[]> {
const exactTerms = query.getExactTerms()
if (exactTerms.length) {
results = results.filter(r => {
const title = getNoteFromCache(r.id)?.path.toLowerCase() ?? ''
const content = stripMarkdownCharacters(
indexedNotes[r.id]?.content ?? '',
getNoteFromCache(r.id)?.content ?? '',
).toLowerCase()
return exactTerms.every(q => content.includes(q))
return exactTerms.every(q => content.includes(q) || title.includes(q))
})
}

// // If the search query contains exclude terms, filter out results that have them
// If the search query contains exclude terms, filter out results that have them
const exclusions = query.exclusions
if (exclusions.length) {
results = results.filter(r => {
const content = stripMarkdownCharacters(
indexedNotes[r.id]?.content ?? '',
getNoteFromCache(r.id)?.content ?? '',
).toLowerCase()
return exclusions.every(q => !content.includes(q.value))
})
@@ -135,7 +202,9 @@ async function search(query: Query): Promise<SearchResult[]> {
export function getMatches(text: string, reg: RegExp): SearchMatch[] {
let match: RegExpExecArray | null = null
const matches: SearchMatch[] = []
let count = 0 // TODO: FIXME: this is a hack to avoid infinite loops
while ((match = reg.exec(text)) !== null) {
if (++count > 100) break
const m = match[0]
if (m) matches.push({ match: m, offset: match.index })
}
@@ -166,16 +235,32 @@ export async function getSuggestions(
else results = []
}
else {
results = results.sort((a, b) => b.score - a.score).slice(0, 50)
results = results.slice(0, 50)

// Put the results with tags on top
const tags = query.segments
.filter(s => s.value.startsWith('#'))
.map(s => s.value)
for (const tag of tags) {
for (const result of results) {
if (result.tags.includes(tag)) {
result.score *= 100
}
}
}
}

// Map the raw results to get usable suggestions
const suggestions = results.map(result => {
const note = indexedNotes[result.id]
const note = getNoteFromCache(result.id)
if (!note) {
throw new Error(`Note "${result.id}" not indexed`)
}

// Remove '#' from tags, for highlighting
query.segments.forEach(s => {
s.value = s.value.replace(/^#/, '')
})
// Clean search matches that match quoted expressions,
// and inject those expressions instead
const foundWords = [
@@ -184,6 +269,7 @@ export async function getSuggestions(
),
...query.segments.filter(s => s.exact).map(s => s.value),
]

const matches = getMatches(note.content, stringsToRegex(foundWords))
const resultNote: ResultNote = {
score: result.score,
@@ -206,34 +292,56 @@ export async function addToIndex(file: TAbstractFile): Promise<void> {
if (!(file instanceof TFile) || file.extension !== 'md') {
return
}

// Check if the file was already indexed as non-existent,
// and if so, remove it from the index (before adding it again)
if (getNoteFromCache(file.path)?.doesNotExist) {
removeFromIndex(file.path)
}

try {
// console.log(`Omnisearch - adding ${file.path} to index`)
const fileCache = app.metadataCache.getFileCache(file)

if (indexedNotes[file.path]) {
// Look for links that lead to non-existing files,
// and index them as well
const metadata = app.metadataCache.getFileCache(file)
if (metadata) {
const nonExisting = getNonExistingNotes(file, metadata)
for (const name of nonExisting.filter(o => !getNoteFromCache(o))) {
addNonExistingToIndex(name, file.path)
}
}

if (getNoteFromCache(file.path)) {
throw new Error(`${file.basename} is already indexed`)
}

// Fetch content from the cache to index it as-is
const content = await app.vault.cachedRead(file)
const content = removeDiacritics(await app.vault.cachedRead(file))

// Make the document and index it
const note: IndexedNote = {
basename: file.basename,
content,
path: file.path,
headings1: fileCache
? extractHeadingsFromCache(fileCache, 1).join(' ')
mtime: file.stat.mtime,

tags: getTagsFromMetadata(metadata),
aliases: getAliasesFromMetadata(metadata).join(''),
headings1: metadata
? extractHeadingsFromCache(metadata, 1).join(' ')
: '',
headings2: fileCache
? extractHeadingsFromCache(fileCache, 2).join(' ')
headings2: metadata
? extractHeadingsFromCache(metadata, 2).join(' ')
: '',
headings3: fileCache
? extractHeadingsFromCache(fileCache, 3).join(' ')
headings3: metadata
? extractHeadingsFromCache(metadata, 3).join(' ')
: '',
}

minisearchInstance.add(note)
indexedNotes[note.path] = note
isIndexChanged = true
addNoteToCache(note.path, note)
}
catch (e) {
console.trace('Error while indexing ' + file.basename)
@@ -242,25 +350,84 @@ export async function addToIndex(file: TAbstractFile): Promise<void> {
}

/**
 * Removes a file from the index
 * @param file
 * @returns
 * Index a non-existing note.
 * Useful to find internal links that lead (yet) to nowhere
 * @param name
 */
export function removeFromIndex(file: TAbstractFile): void {
if (file instanceof TFile && file.path.endsWith('.md')) {
// console.log(`Omnisearch - removing ${file.path} from index`)
return removeFromIndexByPath(file.path)
}
export function addNonExistingToIndex(name: string, parent: string): void {
name = removeAnchors(name)
const filename = name + (name.endsWith('.md') ? '' : '.md')
if (getNoteFromCache(filename)) return

const note = {
path: filename,
basename: name,
mtime: 0,

content: '',
aliases: '',
headings1: '',
headings2: '',
headings3: '',

doesNotExist: true,
parent,
} as IndexedNote
minisearchInstance.add(note)
isIndexChanged = true
addNoteToCache(filename, note)
}

/**
 * Removes a file from the index, by its path
 * @param path
 */
export function removeFromIndexByPath(path: string): void {
const note = indexedNotes[path]
export function removeFromIndex(path: string): void {
if (!path.endsWith('.md')) {
console.info(`"${path}" is not a .md file`)
return
}
const note = getNoteFromCache(path)
if (note) {
minisearchInstance.remove(note)
delete indexedNotes[path]
isIndexChanged = true
removeNoteFromCache(path)
getNonExistingNotesFromCache()
.filter(n => n.parent === path)
.forEach(n => {
removeFromIndex(n.path)
})
}
else {
console.warn(`note not found under path ${path}`)
}
}

const notesToReindex = new Set<TAbstractFile>()
export function addNoteToReindex(note: TAbstractFile): void {
notesToReindex.add(note)
}
export async function reindexNotes(): Promise<void> {
if (settings.showIndexingNotices && notesToReindex.size > 0) {
new Notice(`Omnisearch - Reindexing ${notesToReindex.size} notes`, 2000)
}
for (const note of notesToReindex) {
removeFromIndex(note.path)
await addToIndex(note)
await wait(0)
}
notesToReindex.clear()

await saveIndexToFile()
}

async function saveIndexToFile(): Promise<void> {
if (settings.storeIndexInFile && minisearchInstance && isIndexChanged) {
const json = JSON.stringify(minisearchInstance)
await app.vault.adapter.write(searchIndexFilePath, json)
console.log('Omnisearch - Index saved on disk')

await saveNotesCacheToFile()
isIndexChanged = false
}
}

130
src/settings.ts
@@ -9,8 +9,13 @@ interface WeightingSettings {
}

export interface OmnisearchSettings extends WeightingSettings {
showIndexingNotices: boolean
respectExcluded: boolean
ignoreDiacritics: boolean
showIndexingNotices: boolean
showShortName: boolean
CtrlJK: boolean
CtrlNP: boolean
storeIndexInFile: boolean
}

export class SettingsTab extends PluginSettingTab {
@@ -25,21 +30,12 @@ export class SettingsTab extends PluginSettingTab {
const { containerEl } = this
containerEl.empty()

// Title
const title = document.createElement('h2')
title.textContent = 'Omnisearch settings'
containerEl.appendChild(title)
// Settings main title
containerEl.createEl('h2', { text: 'Omnisearch settings' })

// Show notices
new Setting(containerEl)
.setName('Show indexing notices')
.setDesc('Show a notice when indexing is done, usually at startup.')
.addToggle(toggle =>
toggle.setValue(settings.showIndexingNotices).onChange(async v => {
settings.showIndexingNotices = v
await saveSettings(this.plugin)
}),
)
// #region Behavior

new Setting(containerEl).setName('Behavior').setHeading()

// Respect excluded files
new Setting(containerEl)
@@ -54,10 +50,71 @@ export class SettingsTab extends PluginSettingTab {
}),
)

// Ignore diacritics
new Setting(containerEl)
.setName('Ignore diacritics')
.setDesc(
'EXPERIMENTAL - Normalize diacritics in search terms. Words like "brûlée" or "žluťoučký" will be indexed as "brulee" and "zlutoucky". Needs a restart to take effect.',
)
.addToggle(toggle =>
toggle.setValue(settings.ignoreDiacritics).onChange(async v => {
settings.ignoreDiacritics = v
await saveSettings(this.plugin)
}),
)

new Setting(containerEl)
.setName('Store index in file')
.setDesc(
'EXPERIMENTAL - The index is stored on disk, instead of being rebuilt on every startup.',
)
.addToggle(toggle =>
toggle.setValue(settings.storeIndexInFile).onChange(async v => {
settings.storeIndexInFile = v
await saveSettings(this.plugin)
}),
)

// #endregion Behavior

// #region User Interface

new Setting(containerEl).setName('User Interface').setHeading()

// Show notices
new Setting(containerEl)
.setName('Show indexing notices')
.setDesc('Show a notice when indexing is done, usually at startup.')
.addToggle(toggle =>
toggle.setValue(settings.showIndexingNotices).onChange(async v => {
settings.showIndexingNotices = v
await saveSettings(this.plugin)
}),
)

// Display note names without the full path
new Setting(containerEl)
.setName('Hide full path in results list')
.setDesc(
'In the search results, only show the note name, without the full path.',
)
.addToggle(toggle =>
toggle.setValue(settings.showShortName).onChange(async v => {
settings.showShortName = v
await saveSettings(this.plugin)
}),
)

// #endregion User Interface

// #region Results Weighting

new Setting(containerEl).setName('Results weighting').setHeading()

new Setting(containerEl)
.setName(`File name (default: ${DEFAULT_SETTINGS.weightBasename})`)
.setName(
`File name & declared aliases (default: ${DEFAULT_SETTINGS.weightBasename})`,
)
.addSlider(cb => this.weightSlider(cb, 'weightBasename'))

new Setting(containerEl)
@@ -71,6 +128,36 @@ export class SettingsTab extends PluginSettingTab {
new Setting(containerEl)
.setName(`Headings level 3 (default: ${DEFAULT_SETTINGS.weightH3})`)
.addSlider(cb => this.weightSlider(cb, 'weightH3'))

// #endregion Results Weighting

// #region Shortcuts

new Setting(containerEl).setName('Shortcuts').setHeading()

new Setting(containerEl)
.setName(
'Use [Ctrl/Cmd]+j/k to navigate up/down in the results, if Vim mode is enabled',
)
.addToggle(toggle =>
toggle.setValue(settings.CtrlJK).onChange(async v => {
settings.CtrlJK = v
await saveSettings(this.plugin)
}),
)

new Setting(containerEl)
.setName(
'Use [Ctrl/Cmd]+n/p to navigate up/down in the results, if Vim mode is enabled',
)
.addToggle(toggle =>
toggle.setValue(settings.CtrlNP).onChange(async v => {
settings.CtrlNP = v
await saveSettings(this.plugin)
}),
)

// #endregion Shortcuts
}

weightSlider(cb: SliderComponent, key: keyof WeightingSettings): void {
@@ -85,12 +172,21 @@ export class SettingsTab extends PluginSettingTab {
}

export const DEFAULT_SETTINGS: OmnisearchSettings = {
showIndexingNotices: true,
respectExcluded: true,
ignoreDiacritics: false,

showIndexingNotices: false,
showShortName: false,

weightBasename: 2,
weightH1: 1.5,
weightH2: 1.3,
weightH3: 1.1,

CtrlJK: false,
CtrlNP: false,

storeIndexInFile: false,
} as const

export let settings: OmnisearchSettings = Object.assign({}, DEFAULT_SETTINGS)

17
src/types.d.ts
vendored
@@ -1,7 +1,22 @@
import { type MetadataCache } from 'obsidian'
import type { MetadataCache, ViewState, Vault } from 'obsidian'

declare module 'obsidian' {
interface MetadataCache {
isUserIgnored?(path: string): boolean
}

interface FrontMatterCache {
aliases?: string[] | string
tags?: string[] | string
}

interface ViewState {
state?: {
file?: string
}
}

interface Vault {
getConfig(string): unknown
}
}

43
src/utils.ts
@@ -61,19 +61,6 @@ export function stringsToRegex(strings: string[]): RegExp {
|
||||
return new RegExp(strings.map(s => `(${escapeRegex(s)})`).join('|'), 'gi')
|
||||
}
|
||||
|
||||
export function replaceAll(
|
||||
text: string,
|
||||
terms: string[],
|
||||
cb: (t: string) => string,
|
||||
): string {
|
||||
terms.sort((a, b) => a.length - b.length)
|
||||
const regs = terms.map(term => new RegExp(escapeRegex(term), 'gi'))
|
||||
for (const reg of regs) {
|
||||
text = text.replaceAll(reg, cb)
|
||||
}
|
||||
return text
|
||||
}
|
||||
|
||||
export function extractHeadingsFromCache(
|
||||
cache: CachedMetadata,
|
||||
level: number,
|
||||
@@ -147,3 +134,33 @@ export async function filterAsync<T>(
|
||||
export function stripMarkdownCharacters(text: string): string {
|
||||
return text.replace(/(\*|_)+(.+?)(\*|_)+/g, (match, p1, p2) => p2)
|
||||
}
|
||||
|
||||
export function getAliasesFromMetadata(
|
||||
metadata: CachedMetadata | null,
|
||||
): string[] {
|
||||
const arrOrString = metadata?.frontmatter?.aliases ?? []
|
||||
return (Array.isArray(arrOrString) ? arrOrString : arrOrString.split(','))
|
||||
.map(s => (s ? s.trim() : s))
|
||||
.filter(s => !!s)
|
||||
}
|
||||
|
||||
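The string-or-array normalization in `getAliasesFromMetadata` above can be sketched standalone. The function name `normalizeAliases` and the sample inputs are invented for illustration; the body mirrors the array/split/trim/filter chain from the source:

```typescript
// Frontmatter `aliases` may be a comma-separated string or an array;
// either way, return a clean array of trimmed, non-empty strings.
function normalizeAliases(arrOrString: string[] | string): string[] {
  return (Array.isArray(arrOrString) ? arrOrString : arrOrString.split(','))
    .map(s => (s ? s.trim() : s))
    .filter(s => !!s)
}

console.log(normalizeAliases('foo, bar ,')) // → ['foo', 'bar']
console.log(normalizeAliases(['a', ' b '])) // → ['a', 'b']
```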
export function getTagsFromMetadata(metadata: CachedMetadata | null): string[] {
const arrOrString = metadata?.frontmatter?.tags ?? []
const fromFrontMatter = (
Array.isArray(arrOrString) ? arrOrString : arrOrString.split(',')
)
.map(s => (s ? s.trim() : s))
.filter(s => !!s)
const fromBody = (metadata?.tags ?? []).map(t => t.tag)

return [...fromFrontMatter, ...fromBody].map(t =>
t[0] !== '#' ? '#' + t : t,
)
}

/**
 * https://stackoverflow.com/a/37511463
 */
export function removeDiacritics(str: string): string {
return str.normalize('NFD').replace(/\p{Diacritic}/gu, '')
}

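For reference, the NFD-based stripping used by `removeDiacritics` above can be exercised standalone; the sample words are the same ones quoted in the "Ignore diacritics" setting description in this commit:

```typescript
// Decompose to NFD (base letter + combining marks), then drop the marks.
function removeDiacritics(str: string): string {
  return str.normalize('NFD').replace(/\p{Diacritic}/gu, '')
}

console.log(removeDiacritics('brûlée')) // → 'brulee'
console.log(removeDiacritics('žluťoučký')) // → 'zlutoucky'
```

This is why the index and the search terms must both pass through the same `processTerm` step: only then do "brûlée" and "brulee" land on the same token.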
332
src/vendor/parse-query.ts
vendored
Normal file
@@ -0,0 +1,332 @@
/*!
 * search-query-parser.js
 * Original: https://github.com/nepsilon/search-query-parser
 * Modified by Simon Cambier
 * Copyright(c) 2014-2019
 * MIT Licensed
 */

interface SearchParserOptions {
offsets?: boolean
tokenize: true
keywords?: string[]
ranges?: string[]
alwaysArray?: boolean
}

interface ISearchParserDictionary {
[key: string]: any
}

type SearchParserKeyWordOffset = {
keyword: string
value?: string
}

type SearchParserTextOffset = {
text: string
}

type SearchParserOffset = (
| SearchParserKeyWordOffset
| SearchParserTextOffset
) & {
offsetStart: number
offsetEnd: number
}

interface SearchParserResult extends ISearchParserDictionary {
text: string[]
offsets: SearchParserOffset[]
exclude: { text: string[] }
}

export function parseQuery(
string: string,
options: SearchParserOptions,
): SearchParserResult {
// Set a default options object when none is provided
if (!options) {
options = { offsets: true, tokenize: true }
}
else {
// If options.offsets wasn't passed, set it to true
options.offsets =
typeof options.offsets === 'undefined' ? true : options.offsets
}

if (!string) {
string = ''
}

// Our object to store the query object
const query: SearchParserResult = {
text: [],
offsets: [],
exclude: { text: [] },
}
// When offsets is true, create their array
if (options.offsets) {
query.offsets = []
}
const exclusion: ISearchParserDictionary & { text: string[] } = { text: [] }
const terms = []
// Get a list of search terms respecting single and double quotes
const regex =
/(\S+:'(?:[^'\\]|\\.)*')|(\S+:"(?:[^"\\]|\\.)*")|(-?"(?:[^"\\]|\\.)*")|(-?'(?:[^'\\]|\\.)*')|\S+|\S+:\S+/g
let match
let count = 0 // TODO: FIXME: this is a hack to avoid infinite loops
while ((match = regex.exec(string)) !== null) {
if (++count > 100) break
let term = match[0]
const sepIndex = term.indexOf(':')

// Terms that contain a `:`
if (sepIndex !== -1) {
const key = term.slice(0, sepIndex)
let val = term.slice(sepIndex + 1)

// Strip backslashes respecting escapes
val = (val + '').replace(/\\(.?)/g, function (s, n1) {
switch (n1) {
case '\\':
return '\\'
case '0':
return '\u0000'
case '':
return ''
default:
return n1
}
})
terms.push({
keyword: key,
value: val,
offsetStart: match.index,
offsetEnd: match.index + term.length,
})
}

// Other terms
else {
let isExcludedTerm = false
if (term[0] === '-') {
isExcludedTerm = true
term = term.slice(1)
}

// Strip backslashes respecting escapes
term = (term + '').replace(/\\(.?)/g, function (s, n1) {
switch (n1) {
case '\\':
return '\\'
case '0':
return '\u0000'
case '':
return ''
default:
return n1
}
})

if (isExcludedTerm) {
exclusion.text.push(term)
}
else {
terms.push({
text: term,
offsetStart: match.index,
offsetEnd: match.index + term.length,
})
}
}
}
// Reverse to ensure proper order when pop()'ing.
terms.reverse()
// For each search term
let term
while ((term = terms.pop())) {
// When just a simple term
if (term.text) {
// We add it as pure text
query.text.push(term.text)
// When offsets is true, push a new offset
if (options.offsets) {
query.offsets.push(term)
}
}
// We got an advanced search syntax
else if (term.keyword) {
let key = term.keyword
// Check if the key is a registered keyword
options.keywords = options.keywords || []
let isKeyword = false
let isExclusion = false
if (!/^-/.test(key)) {
isKeyword = !(options.keywords.indexOf(key) === -1)
}
else if (key[0] === '-') {
const _key = key.slice(1)
isKeyword = !(options.keywords.indexOf(_key) === -1)
if (isKeyword) {
key = _key
isExclusion = true
}
}

// Check if the key is a registered range
options.ranges = options.ranges || []
const isRange = !(options.ranges.indexOf(key) === -1)
// When the key matches a keyword
if (isKeyword) {
// When offsets is true, push a new offset
if (options.offsets) {
query.offsets.push({
keyword: key,
value: term.value,
offsetStart: isExclusion ? term.offsetStart + 1 : term.offsetStart,
offsetEnd: term.offsetEnd,
})
}

const value = term.value
// When value is a thing
if (value.length) {
// Get an array of values when several are there
const values = value.split(',')
if (isExclusion) {
if (exclusion[key]) {
// ...many times...
if (exclusion[key] instanceof Array) {
// ...and got several values this time...
if (values.length > 1) {
// ... concatenate both arrays.
exclusion[key] = exclusion[key].concat(values)
}
else {
// ... append the current single value.
exclusion[key].push(value)
}
}
// We saw that keyword only once before
else {
// Put both the current value and the new
// value in an array
exclusion[key] = [exclusion[key]]
exclusion[key].push(value)
}
}
// First time we see that keyword
else {
// ...and got several values this time...
if (values.length > 1) {
// ...add all values seen.
exclusion[key] = values
}
// Got only a single value this time
else {
// Record its value as a string
if (options.alwaysArray) {
// ...but we always return an array if option alwaysArray is true
exclusion[key] = [value]
}
else {
// Record its value as a string
exclusion[key] = value
}
}
}
}
else {
// If we already have seen that keyword...
if (query[key]) {
// ...many times...
if (query[key] instanceof Array) {
// ...and got several values this time...
if (values.length > 1) {
// ... concatenate both arrays.
query[key] = query[key].concat(values)
}
else {
// ... append the current single value.
|
||||
query[key].push(value)
|
||||
}
|
||||
}
|
||||
// We saw that keyword only once before
|
||||
else {
|
||||
// Put both the current value and the new
|
||||
// value in an array
|
||||
query[key] = [query[key]]
|
||||
query[key].push(value)
|
||||
}
|
||||
}
|
||||
// First time we see that keyword
|
||||
else {
|
||||
// ...and got several values this time...
|
||||
if (values.length > 1) {
|
||||
// ...add all values seen.
|
||||
query[key] = values
|
||||
}
|
||||
// Got only a single value this time
|
||||
else {
|
||||
if (options.alwaysArray) {
|
||||
// ...but we always return an array if option alwaysArray is true
|
||||
query[key] = [value]
|
||||
}
|
||||
else {
|
||||
// Record its value as a string
|
||||
query[key] = value
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
// The key allows a range
|
||||
else if (isRange) {
|
||||
// When offsets is true, push a new offset
|
||||
if (options.offsets) {
|
||||
query.offsets.push(term)
|
||||
}
|
||||
|
||||
const value = term.value
|
||||
// Range are separated with a dash
|
||||
const rangeValues = value.split('-')
|
||||
// When both end of the range are specified
|
||||
// keyword:XXXX-YYYY
|
||||
query[key] = {}
|
||||
if (rangeValues.length === 2) {
|
||||
query[key].from = rangeValues[0]
|
||||
query[key].to = rangeValues[1]
|
||||
}
|
||||
// When pairs of ranges are specified
|
||||
// keyword:XXXX-YYYY,AAAA-BBBB
|
||||
// else if (!rangeValues.length % 2) {
|
||||
// }
|
||||
// When only getting a single value,
|
||||
// or an odd number of values
|
||||
else {
|
||||
query[key].from = value
|
||||
}
|
||||
}
|
||||
else {
|
||||
// We add it as pure text
|
||||
const text = term.keyword + ':' + term.value
|
||||
query.text.push(text)
|
||||
|
||||
// When offsets is true, push a new offset
|
||||
if (options.offsets) {
|
||||
query.offsets.push({
|
||||
text: text,
|
||||
offsetStart: term.offsetStart,
|
||||
offsetEnd: term.offsetEnd,
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Return forged query object
|
||||
query.exclude = exclusion
|
||||
return query
|
||||
}
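The term cleanup near the top of this hunk (a leading `-` marks an exclusion, then backslash escapes are stripped with `replace(/\\(.?)/g, ...)`) can be exercised in isolation. A minimal sketch; `cleanTerm` is an illustrative name, not a function from this source:

```javascript
// Illustrative helper mirroring the parser's term cleanup: detect a
// leading "-" (exclusion marker), then unescape backslash sequences
// with the same /\\(.?)/g replacement used above.
function cleanTerm(raw) {
  let term = raw
  let excluded = false
  if (term[0] === '-') {
    excluded = true
    term = term.slice(1)
  }
  term = (term + '').replace(/\\(.?)/g, function (s, n1) {
    switch (n1) {
      case '\\':
        return '\\' // "\\\\" collapses to a single backslash
      case '0':
        return '\u0000' // "\\0" becomes a NUL character
      case '':
        return '' // a lone trailing backslash is dropped
      default:
        return n1 // any other escaped character is kept verbatim
    }
  })
  return { term, excluded }
}

console.log(cleanTerm('-foo')) // { term: 'foo', excluded: true }
console.log(cleanTerm('a\\"b')) // { term: 'a"b', excluded: false }
```

A term that survives this cleanup without an exclusion flag is later pushed into `query.text`, while excluded terms land in `exclusion.text`.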
@@ -22,6 +22,7 @@
     ]
   },
   "include": [
-    "**/*.ts"
-    , "src/__tests__/event-bus-tests.mts" ]
+    "**/*.ts",
+    "src/__tests__/event-bus-tests.mts"
+  ]
 }
@@ -19,5 +19,14 @@
   "1.1.0": "0.14.2",
   "1.1.1": "0.14.2",
   "1.2.0": "0.14.2",
-  "1.2.1": "0.14.2"
+  "1.2.1": "0.14.2",
+  "1.3.0-beta": "0.14.2",
+  "1.3.1-beta": "0.14.2",
+  "1.3.2-beta": "0.14.2",
+  "1.3.3-beta": "0.14.2",
+  "1.3.3": "0.14.2",
+  "1.3.4": "0.14.2",
+  "1.3.5-beta1": "0.14.2",
+  "1.3.5-beta2": "0.14.2",
+  "1.3.5-beta3": "0.14.2"
 }
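For context, `versions.json` in an Obsidian plugin maps each plugin release to the minimum app version it requires (here, every release requires `0.14.2`). A rough sketch of how such a map can be queried; `naiveCompare` and the two sample entries are illustrative, and the comparison ignores pre-release suffixes:

```javascript
// Sample of the versions.json shape: plugin release -> minimum app version.
const versions = {
  '1.2.1': '0.14.2',
  '1.3.4': '0.14.2',
}

// Naive dotted-number comparison (illustrative only; real semver
// handling would also order pre-release tags like "-beta1").
function naiveCompare(a, b) {
  const pa = a.split('.').map(Number)
  const pb = b.split('.').map(Number)
  for (let i = 0; i < Math.max(pa.length, pb.length); i++) {
    const d = (pa[i] || 0) - (pb[i] || 0)
    if (d !== 0) return d
  }
  return 0
}

// Keep only the releases whose minimum requirement is satisfied
// by the installed app version.
function compatibleReleases(appVersion) {
  return Object.keys(versions).filter(
    (plugin) => naiveCompare(appVersion, versions[plugin]) >= 0
  )
}

console.log(compatibleReleases('0.14.2')) // [ '1.2.1', '1.3.4' ]
console.log(compatibleReleases('0.13.0')) // []
```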