How to improve compile time when I have a large HashMap literal?

I have a very large HashMap of physical dimensions (300k+ records of three-element tuples) that I would like to keep as a HashMap (I could move it to an SQLite DB and query it, but I would rather not, for performance reasons). Including it as a literal makes compilation time ... long. Is there a better approach? Can I serialize it to disk in a binary format and load it back into a HashMap when the binary/library starts? Designing and testing with a subset works fine, but I need the complete data for production ...

2 answers

So you are hardcoding the hash map? This looks like a perfect-hashing problem; see the https://github.com/sfackler/rust-phf crate.
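A minimal sketch of how that can look, assuming the `phf` crate with its `macros` feature enabled in Cargo.toml (e.g. `phf = { version = "0.11", features = ["macros"] }`; the version, map name, and data below are illustrative, not from the question):

```rust
use phf::phf_map;

// Hypothetical subset of the 300k-record table: name -> three-element tuple.
// phf computes a perfect hash function at compile time, so no HashMap is
// built at runtime and lookups are a single probe.
static DIMENSIONS: phf::Map<&'static str, (f64, f64, f64)> = phf_map! {
    "water"   => (1000.0, 4.184, 0.598),
    "ethanol" => (789.0, 2.46, 0.171),
};

fn main() {
    if let Some(&(density, heat_capacity, conductivity)) = DIMENSIONS.get("water") {
        println!("{} {} {}", density, heat_capacity, conductivity);
    }
}
```

For a table this large, `phf_codegen` in a build script may be more practical than a 300k-entry macro invocation.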

As for compilation time: move the hash map into a separate crate, and Cargo will recompile that crate only when the hash map data actually changes.
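A sketch of that layout (the crate names are made up for illustration): the data lives in its own workspace member that the main crate depends on by path, so editing application code never triggers a rebuild of the big table.

```toml
# Workspace root Cargo.toml (crate names are hypothetical)
[workspace]
members = ["app", "dimensions-data"]

# app/Cargo.toml then depends on the data crate:
#
# [dependencies]
# dimensions-data = { path = "../dimensions-data" }
```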


If you are fine with developing and testing against a smaller data set, you can use conditional compilation. Here is a simple example:

#[cfg(debug_assertions)]
const VALUE: u32 = 0;

#[cfg(not(debug_assertions))]
const VALUE: u32 = 1;

fn main() {
    println!("value: {}", VALUE);
}

If you compile without optimization ("debug" mode), then debug_assertions is true and VALUE is 0; if you compile with optimization ("release" mode), then debug_assertions is false and VALUE is 1.
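The same trick extends to the data itself: embed only a small subset in debug builds. A minimal sketch, where the inline strings stand in for what would realistically be `include_str!` of two different data files (all names and values below are made up):

```rust
use std::collections::HashMap;

// Illustrative data only. In a real project each branch could be
// include_str!("subset.tsv") / include_str!("full.tsv").
#[cfg(debug_assertions)]
const RAW: &str = "hydrogen\t1\t1.008\nhelium\t2\t4.003\n";

#[cfg(not(debug_assertions))]
const RAW: &str = "hydrogen\t1\t1.008\nhelium\t2\t4.003\nlithium\t3\t6.94\n";

// Parse the embedded tab-separated data into a HashMap at startup.
fn load() -> HashMap<&'static str, (u32, f64)> {
    RAW.lines()
        .filter(|line| !line.is_empty())
        .map(|line| {
            let mut fields = line.split('\t');
            let name = fields.next().unwrap();
            let number: u32 = fields.next().unwrap().parse().unwrap();
            let mass: f64 = fields.next().unwrap().parse().unwrap();
            (name, (number, mass))
        })
        .collect()
}

fn main() {
    let map = load();
    println!("loaded {} entries", map.len());
}
```

Debug builds then compile only the small table, while `--release` builds embed and parse the full one.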

Cargo chooses these configurations automatically: a plain `cargo build` produces a debug build, while `cargo build --release` produces a release build.

As for serializing the data to disk, crates such as bincode (binary) or rustc_serialize::json / serde_json (JSON) can write the map out and read it back into a HashMap when the program starts.
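To make the idea concrete without pulling in those crates, here is a dependency-free sketch of the same pattern: dump the map once to a compact binary file, then deserialize it at startup. The format and file name are invented for illustration; in practice bincode handles all of this for you.

```rust
use std::collections::HashMap;
use std::convert::TryInto;
use std::fs::File;
use std::io::{self, Read, Write};

type Map = HashMap<String, (f64, f64, f64)>;

// Write each entry as: key length (u32, little-endian), key bytes, three f64s.
fn save(map: &Map, path: &str) -> io::Result<()> {
    let mut f = File::create(path)?;
    for (key, &(a, b, c)) in map {
        f.write_all(&(key.len() as u32).to_le_bytes())?;
        f.write_all(key.as_bytes())?;
        for v in &[a, b, c] {
            f.write_all(&v.to_le_bytes())?;
        }
    }
    Ok(())
}

// Read the whole file and rebuild the HashMap entry by entry.
fn load(path: &str) -> io::Result<Map> {
    let mut bytes = Vec::new();
    File::open(path)?.read_to_end(&mut bytes)?;
    let mut map = Map::new();
    let mut pos = 0;
    while pos < bytes.len() {
        let len = u32::from_le_bytes(bytes[pos..pos + 4].try_into().unwrap()) as usize;
        pos += 4;
        let key = String::from_utf8(bytes[pos..pos + len].to_vec()).unwrap();
        pos += len;
        let mut vals = [0f64; 3];
        for v in &mut vals {
            *v = f64::from_le_bytes(bytes[pos..pos + 8].try_into().unwrap());
            pos += 8;
        }
        map.insert(key, (vals[0], vals[1], vals[2]));
    }
    Ok(map)
}

fn main() -> io::Result<()> {
    let mut map = Map::new();
    map.insert("water".into(), (1000.0, 4.184, 0.598));
    save(&map, "dims.bin")?;
    let loaded = load("dims.bin")?;
    println!("{} entries", loaded.len());
    Ok(())
}
```

Loading 300k records this way is a single sequential read plus HashMap inserts, which is typically far cheaper than compiling a 300k-entry literal.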


Source: https://habr.com/ru/post/1627474/

