
Commit 26d0b59

Merge branch 'master' of https://github.com/kouak/dataloader into kouak-master

2 parents 6521f54 + 5ff6f86

File tree

2 files changed: +76 −2


README.md

Lines changed: 0 additions & 2 deletions
@@ -439,5 +439,3 @@ Looking to get started with a specific back-end? Try the [loaders in the example
 [cache algorithms]: https://en.wikipedia.org/wiki/Cache_algorithms
 [express]: http://expressjs.com/
 [babel/polyfill]: https://babeljs.io/docs/usage/polyfill/
-
-

examples/RethinkDB.md

Lines changed: 76 additions & 0 deletions
@@ -0,0 +1,76 @@
# RethinkDB

RethinkDB offers a batching method called `getAll`, but there are a few caveats:

* The order of results is not guaranteed ([rethinkdb/rethinkdb#5187](https://github.com/rethinkdb/rethinkdb/issues/5187))
* Non-existent keys will not return an empty record; they are simply missing from the results
Assuming a table `example_table` with these records:

```js
[
  {"id": 1, "name": "Document 1"},
  {"id": 2, "name": "Document 2"}
]
```
A `getAll(1, 2, 3)` query on this table could return an array such as:

```js
[
  {"id": 2, "name": "Document 2"},
  {"id": 1, "name": "Document 1"}
]
```

In essence, this naive implementation won't work:

```js
var DataLoader = require('dataloader');
var r = require('rethinkdb');
var db = await r.connect();

var batchLoadFn = keys => db.table('example_table').getAll(...keys).then(res => res.toArray());
var exampleLoader = new DataLoader(batchLoadFn);

await exampleLoader.loadMany([1, 2, 3]); // Throws, because the returned values length !== keys length

await exampleLoader.loadMany([1, 2]);
await exampleLoader.load(1); // {"id": 2, "name": "Document 2"}, the wrong document, because getAll returned the results out of order
```

A solution is to normalize the results returned by `getAll` to match the structure of the supplied `keys`.
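In other words, for the keys `[1, 2, 3]` and the table above, the batch function should resolve to one entry per key, in the same order as the keys, with an `Error` in place of any missing key:

```js
// Expected resolution value of the normalized batch function for keys [1, 2, 3]:
[
  {"id": 1, "name": "Document 1"},
  {"id": 2, "name": "Document 2"},
  new Error('Key not found : 3') // missing keys map to an Error, not to undefined
]
```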
To achieve this efficiently, we first write an indexing function that returns a Map of results indexed by their key field. Parameters:

* `results`: Array of RethinkDB results
* `indexField`: String naming the field that was used as the index for this batch query
* `cacheKeyFn`: Optional function used to serialize non-scalar index field values
```js
function indexResults(results, indexField, cacheKeyFn = key => key) {
  var indexedResults = new Map();
  results.forEach(res => {
    indexedResults.set(cacheKeyFn(res[indexField]), res);
  });
  return indexedResults;
}
```
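For example, applied to the out-of-order `getAll` results shown above, `indexResults` yields a Map keyed by `id`:

```js
var results = [
  {"id": 2, "name": "Document 2"},
  {"id": 1, "name": "Document 1"}
];

var indexedResults = indexResults(results, 'id');
indexedResults.get(1); // {"id": 1, "name": "Document 1"}
indexedResults.get(3); // undefined, key 3 has no matching record
```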
We can then leverage this Map in another utility function that, given the requested keys, produces a normalizing function for the RethinkDB results:
```js
function normalizeRethinkDbResults(keys, indexField, cacheKeyFn = key => key) {
  return results => {
    var indexedResults = indexResults(results, indexField, cacheKeyFn);
    return keys.map(val => indexedResults.get(cacheKeyFn(val)) || new Error(`Key not found : ${val}`));
  };
}
```
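For example, the normalizing function produced for keys `[1, 2, 3]` puts the results back in key order and substitutes an `Error` for the missing key:

```js
var normalize = normalizeRethinkDbResults([1, 2, 3], 'id');
normalize([
  {"id": 2, "name": "Document 2"},
  {"id": 1, "name": "Document 1"}
]);
// [
//   {"id": 1, "name": "Document 1"},
//   {"id": 2, "name": "Document 2"},
//   Error('Key not found : 3')
// ]
```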

Full DataLoader implementation:

```js
var DataLoader = require('dataloader');
var r = require('rethinkdb');
var db = await r.connect();

var batchLoadFn = keys => db.table('example_table')
  .getAll(...keys)
  .then(res => res.toArray())
  .then(normalizeRethinkDbResults(keys, 'id'));

var exampleLoader = new DataLoader(batchLoadFn);

await exampleLoader.loadMany([1, 2, 3]); // [{"id": 1, "name": "Document 1"}, {"id": 2, "name": "Document 2"}, Error]

await exampleLoader.load(1); // {"id": 1, "name": "Document 1"}
```
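Note that, as with any DataLoader, individual `load` calls made in the same tick are batched together, so, for instance, the following issues a single `getAll` query:

```js
// Both loads below are dispatched as one batch, i.e. one getAll(1, 2) round trip.
var [doc1, doc2] = await Promise.all([
  exampleLoader.load(1),
  exampleLoader.load(2)
]);
```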
