
MeiliDB

Build Status dependency status License Rust 1.31+

A full-text search database using a key-value store internally.

It uses RocksDB like a classic database, to store documents and internal data. The power of the key-value store allows us to handle updates and queries with small memory and CPU overheads.
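To illustrate the idea (not the actual MeiliDB internals), a full-text engine on top of a key-value store boils down to an inverted index: words map to the ids of the documents that contain them. The sketch below uses an in-memory `BTreeMap` where the real engine uses RocksDB; all names here are illustrative, not part of the MeiliDB API.

```rust
use std::collections::BTreeMap;

// Illustrative sketch: an inverted index kept in a key-value map.
// Keys are words, values are the ids of documents containing them.
fn index_document(store: &mut BTreeMap<String, Vec<u64>>, id: u64, text: &str) {
    for word in text.split_whitespace() {
        let ids = store.entry(word.to_lowercase()).or_insert_with(Vec::new);
        if !ids.contains(&id) {
            ids.push(id);
        }
    }
}

// A query for one word is a single key lookup in the store.
fn search<'a>(store: &'a BTreeMap<String, Vec<u64>>, word: &str) -> &'a [u64] {
    store
        .get(&word.to_lowercase())
        .map(Vec::as_slice)
        .unwrap_or(&[])
}

fn main() {
    let mut store = BTreeMap::new();
    index_document(&mut store, 1, "fast search engine");
    index_document(&mut store, 2, "search database");
    println!("{:?}", search(&store, "search")); // documents 1 and 2 match
}
```

Because both indexing and querying reduce to key lookups, most of the work is delegated to the key-value store, which is what keeps the memory and CPU overhead small.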

You can read the deep dive if you want more information on the engine; it describes the whole process of generating updates and handling queries.

We will be proud if you send pull requests to help us grow this project; you can start with issues tagged "good-first-issue"!

At the moment this is a library only, which means that binaries are not part of this repository. But since I'm still nice, I have made some examples for you in the examples/ folder that work with the data located in the misc/ folder.

In the near future MeiliDB will be a binary like any other database: updated and queried using some kind of protocol. That is the final goal; see the milestones. MeiliDB will just be a bunch of network and protocol functions wrapping the library, which itself will be published to https://crates.io and follow the same update cycle.

Performances

These measurements were made with a version dated October 2018 and need to be updated.

We ran some tests on remote machines with a dataset of nearly 280k products, on a server that costs $5/month with 1 vCPU and 1 GB of RAM. On the same index and with a simple query, it can handle:

  • nearly 190 users with an average response time of 90ms
  • 150 users with an average response time of 70ms
  • 100 users with an average response time of 45ms

Network time is included in these measurements; the servers are located in Amsterdam and the tests were made between two different datacenters.

Usage and examples

MeiliDB works with an index, like most search engines. So to test the library you can create one by indexing a simple CSV file.

cargo run --release --example create-database -- test.mdb misc/kaggle.csv

Once the command has finished indexing, the database will have been saved under the test.mdb folder.
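Conceptually, the create-database step reads each CSV row as a document before indexing it. The naive sketch below shows that parsing step using only the standard library; it ignores quoted fields and is not the parser the example actually uses, and all names in it are made up for illustration.

```rust
// Hypothetical sketch: turn CSV text into (id, fields) records,
// assuming the first column is a numeric document id and no field
// contains a comma. The real example uses MeiliDB's own code.
fn parse_csv(data: &str) -> Vec<(u64, Vec<String>)> {
    data.lines()
        .skip(1) // skip the header line
        .filter_map(|line| {
            let mut fields = line.split(',').map(str::to_string);
            let id = fields.next()?.parse().ok()?;
            Some((id, fields.collect()))
        })
        .collect()
}

fn main() {
    let csv = "id,title\n1,lightning fast search\n2,key-value stores";
    for (id, fields) in parse_csv(csv) {
        println!("document {}: {:?}", id, fields);
    }
}
```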

Now you can easily run the query-database example to check what is stored in it.

cargo run --release --example query-database -- test.mdb
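To give an intuition of what answering a query involves (this is illustrative only, not the ranking MeiliDB actually implements), a multi-word query can look up each word in the inverted index, count how many query words each document matches, and return the best matches first. All names below are made up for the sketch.

```rust
use std::collections::HashMap;

// Hedged sketch: rank documents by the number of query words they match.
fn rank(index: &HashMap<&str, Vec<u64>>, query: &str) -> Vec<(u64, usize)> {
    let mut hits: HashMap<u64, usize> = HashMap::new();
    for word in query.split_whitespace() {
        if let Some(ids) = index.get(word) {
            for &id in ids {
                *hits.entry(id).or_insert(0) += 1;
            }
        }
    }
    let mut results: Vec<_> = hits.into_iter().collect();
    // best matches first, ties broken by document id
    results.sort_by(|a, b| b.1.cmp(&a.1).then(a.0.cmp(&b.0)));
    results
}

fn main() {
    let mut index = HashMap::new();
    index.insert("fast", vec![1]);
    index.insert("search", vec![1, 2]);
    println!("{:?}", rank(&index, "fast search")); // document 1 matches both words
}
```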