Best way to do an internal search over a lot of data

Hello dear community,
I have about 20,000 database entries in my app, each with location data and a few extra fields (about six fields per entry).

Now I would like to implement a simple search function. I have considered the following approaches:

Variant 1:
In my view I serialize the data to JSON and pass it to my template via the context, where I write it into a JavaScript variable. I then react to the form input, build the HTML elements with JavaScript, and display the matching results to the user.

From my perspective this is the simplest solution, but I can’t estimate how badly performance suffers when that much data is written into a single variable:
let data = JSON.parse('{{ data|escapejs }}');

Variant 2:
Once at least three letters have been entered, I make an AJAX call, look up the matching results in the database, and then use JavaScript to build the HTML elements with the respective content and display them to the user.

Here I can’t estimate how heavy the server load becomes if users fire off dozens of requests.

Variant 3:
I create the HTML directly in the template in a loop. I’ve already tried that, but the page is extremely slow because the resulting DOM is huge.

Which variant would you choose - are there any better approaches?

What are you looking to do with the results of this search? What are you searching for? Is the data you’re searching in one field or (possibly) spread across multiple fields?

The common flow is that the search terms are entered into a form, the form is submitted to a view. The view performs the search and renders the response with the found data.

In general terms, this type of search is a very common and routine operation, and 20,000 entries is rather small in the grand scheme of things.
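To make the one-field-vs-several-fields question concrete, here is a minimal, framework-agnostic sketch in plain Python (the records and field names are invented for illustration). In Django the same idea would typically become a queryset filter built from Q objects, e.g. `Q(name__icontains=term) | Q(city__icontains=term)`.

```python
# Sketch: search one term across multiple fields of each record.
# The field names and sample data below are made up for illustration;
# in Django this would be a queryset filter with OR-ed Q objects.

SEARCH_FIELDS = ("name", "city", "notes")

def search(entries, term):
    """Return entries where any searched field contains `term` (case-insensitive)."""
    needle = term.lower()
    return [
        e for e in entries
        if any(needle in str(e.get(field, "")).lower() for field in SEARCH_FIELDS)
    ]

entries = [
    {"name": "Alte Oper", "city": "Frankfurt", "notes": "concert hall"},
    {"name": "Elbphilharmonie", "city": "Hamburg", "notes": "concert hall"},
    {"name": "Zoo", "city": "Frankfurt", "notes": ""},
]

print([e["name"] for e in search(entries, "frank")])
# -> ['Alte Oper', 'Zoo']
```

The database does this kind of matching far more efficiently than Python can, which is exactly why the search belongs in the view rather than the browser.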

Yes, 20k entries is tiny.

Although even then, your idea of running the search client-side is not necessarily a good one.

Let’s imagine your scenario with 20k entries (each entry is ~500 bytes):
- ~10 MB downloaded to the client (maybe 1-2 MB gzipped/brotlied)
- 20k entries parsed client-side via JavaScript. Fine, this should be quick.

Same scenario, with 200,000 entries:
- ~100 MB download.
- 200,000 entries to parse/search.
- Your user starts noticing some sluggishness.

2 million entries:
- ~1 GB download (the size of a small video).
- a loop with 2 million iterations. Assuming usage of indexOf and regex pattern matching, it now starts to feel much slower.

And it gets worse. 2 million entries is really not that much.
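Those back-of-envelope numbers are easy to reproduce (assuming the ~500 bytes per entry above and an optimistic ~10x compression ratio; both are rough guesses, not measurements):

```python
# Rough payload estimates for shipping the whole dataset to the client.
# Assumptions (guesses): ~500 bytes per serialized entry, and gzip/brotli
# shrinking JSON by roughly a factor of 10 at best.

BYTES_PER_ENTRY = 500

for n in (20_000, 200_000, 2_000_000):
    raw_mb = n * BYTES_PER_ENTRY / 1_000_000
    gzipped_mb = raw_mb / 10  # optimistic compression ratio
    print(f"{n:>9,} entries: ~{raw_mb:,.0f} MB raw, ~{gzipped_mb:,.0f} MB gzipped")
```

Even compressed, the payload grows linearly with the dataset, and the client still has to parse and scan all of it on every search.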

So your variant 2 is, I believe, the most viable one. But again, you are the only one who knows how large your dataset will grow.

Have you thought about how to run the search server-side? Depending on your back-end, this can be far more efficient (whether you use MySQL/Postgres or something more specialised such as Solr or Elasticsearch…).

Long story short :) whatever you choose to do, I believe variant 2 is the best for the medium/long term:
- user starts with an empty page
- user types in a query
- query gets sent via XHR/fetch; client disables the search field (to avoid a flood of requests – pair with rate limiting on the server, for example)
- server receives the query
- server hits the (nicely indexed) database with the query
- server fetches the data from the DB and generates JSON/markup
- server returns the response
- client receives the response and renders it
- rinse
- repeat
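The server half of those steps, sketched framework-agnostically in plain Python (an in-memory list stands in for the indexed database; in a real Django app this would be a view querying the ORM and returning a JsonResponse):

```python
import json

# Stand-in for the (nicely indexed) database table; contents are invented.
DB = [
    {"id": 1, "name": "Alte Oper", "city": "Frankfurt"},
    {"id": 2, "name": "Elbphilharmonie", "city": "Hamburg"},
]

MIN_QUERY_LEN = 3   # mirror the "at least three letters" rule server-side too
MAX_RESULTS = 50    # never return the whole table in one response

def handle_search(query: str) -> str:
    """Steps: validate query -> hit the 'database' -> serialize -> return JSON."""
    query = query.strip().lower()
    if len(query) < MIN_QUERY_LEN:
        return json.dumps({"error": "query too short", "results": []})
    hits = [row for row in DB if query in row["name"].lower()][:MAX_RESULTS]
    return json.dumps({"results": hits})

print(handle_search("ope"))
```

Validating the minimum length on the server matters even if the client already enforces it, since nothing stops a user from hitting the endpoint directly.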

oh and do use debounce techniques client-side, or trigger the search only on hitting “submit”.
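Debouncing itself would live in your JavaScript (typically a setTimeout you clear and restart on every keystroke); the idea, sketched in Python for illustration, is simply: restart a short timer on each call and run the function only when the timer is allowed to expire.

```python
import threading
import time

def debounce(wait_seconds):
    """Decorator: delay calls and keep only the most recent one within `wait_seconds`."""
    def decorator(fn):
        timer = None
        def debounced(*args, **kwargs):
            nonlocal timer
            if timer is not None:
                timer.cancel()  # a newer "keystroke" supersedes the pending one
            timer = threading.Timer(wait_seconds, fn, args, kwargs)
            timer.start()
        return debounced
    return decorator

calls = []

@debounce(0.05)
def run_search(term):
    calls.append(term)

# Simulate fast typing: only the final query should trigger a search.
for term in ("f", "fr", "fra", "fran"):
    run_search(term)

time.sleep(0.2)
print(calls)
```

With this in place, four rapid "keystrokes" result in a single search for the final term instead of four requests.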

As a supplemental note, if you do go with your #2, take a look at Django-Select2 (the Django-Select2 8.0.0 documentation).