
Commit

- `hGetCount()`
- `hGetFields()`
- `hGetValues()`

gerold-penz committed Aug 15, 2024
1 parent 9a49d20 commit a61b30b

Showing 4 changed files with 208 additions and 25 deletions.
134 changes: 127 additions & 7 deletions README.md
@@ -60,7 +60,10 @@ The ideas for the implementation come from
- [`hGet()`](#hash-map-object---read-value)
- [`hmSet()`](#hash-map-object---write-multiple-values)
- [`hmGet()`](#hash-map-object---read-multiple-values)
- [`hExists()`](#hash-map-object---xxx)
- [`hHasField()`](#hash-map-object---has-field)
- [`hGetCount()`](#hash-map-object---count-fields)
- [`hGetFields()`](#hash-map-object---get-all-field-names)
- [`hGetValues()`](#hash-map-object---get-all-values)
- Extended database topics
- [Multiple Databases](#multiple-databases)
- [Database Transactions](#database-transactions)
@@ -932,9 +935,10 @@ is read from the database.
If the data record does not yet exist, a new "Map Object" is created.
Then the entry marked with `field` is added to the "Map Object" or overwritten.
Finally, the modified "Map Object" is written back to the database.

Inspired by: https://docs.keydb.dev/docs/commands/#hset

Do not use it with several large amounts of data or blobs.
Do not use the hash functions with very large amounts of data or blobs.
This is because the entire data record with all fields is always read and written.
It is better to use `setValues()` and `getValues()` for large amounts of data.
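
The read-modify-write cycle described above can be illustrated with a minimal sketch (keys, field names and values are only illustrative):

```typescript
import { BunSqliteKeyValue } from "bun-sqlite-key-value"

const store = new BunSqliteKeyValue()

// First call: no record exists yet, so a new "Map Object" is created,
// "field-1" is added, and the record is written to the database.
store.hSet("key-1", "field-1", "value-1")

// Second call: the whole record is read, "field-1" is overwritten,
// and the record is written back.
store.hSet("key-1", "field-1", "new-value")

store.hGet("key-1", "field-1") // --> "new-value"
```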

@@ -990,12 +994,13 @@ First the
is read from the database.
If the data record (marked with `key`) does not exist, `undefined` is returned.
If the field (marked with `field`) does not exist in the "Map Object", `undefined` is returned.
Inspired by: https://docs.keydb.dev/docs/commands/#hget

Do not use it with several large amounts of data or blobs.
Do not use the hash functions with very large amounts of data or blobs.
This is because the entire data record with all fields is always read and written.
It is better to use `setValues()` and `getValues()` for large amounts of data.

Inspired by: https://docs.keydb.dev/docs/commands/#hget

### key

The key must be a string.
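
As a minimal sketch of the cases described above (keys, field names and values are only illustrative):

```typescript
import { BunSqliteKeyValue } from "bun-sqlite-key-value"

const store = new BunSqliteKeyValue()

store.hGet("key-1", "field-1") // --> undefined (record does not exist)

store.hSet("key-1", "field-1", "value-1")

store.hGet("key-1", "field-1") // --> "value-1"
store.hGet("key-1", "field-2") // --> undefined (field does not exist)
```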
@@ -1027,6 +1032,12 @@ hmSet(key: string, fields: {[field: string]: T}, ttlMs?: number)
Like `hSet()`, with the difference that several fields
are written to the database in one go.

Do not use the hash functions with very large amounts of data or blobs.
This is because the entire data record with all fields is always read and written.
It is better to use `setValues()` and `getValues()` for large amounts of data.

Inspired by: https://docs.keydb.dev/docs/commands/#hmset

### key

The key must be a string.
@@ -1060,18 +1071,25 @@ store.hmSet("my-key", {
## Hash (Map Object) - Read Multiple Values

```typescript
hmGet(key: string, fields: string[])
hmGet(key: string, fields?: string[])
```

Like `hGet()`, with the difference that several fields are read in one go.

Do not use the hash functions with very large amounts of data or blobs.
This is because the entire data record with all fields is always read and written.
It is better to use `setValues()` and `getValues()` for large amounts of data.

Inspired by: https://docs.keydb.dev/docs/commands/#hmget

### key

The key must be a string.

### fields

Array with field names.
If the parameter is not specified, all fields are returned.

### Example
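
A minimal sketch, modeled on the tests added in this commit (keys, field names and values are only illustrative):

```typescript
import { BunSqliteKeyValue } from "bun-sqlite-key-value"

const store = new BunSqliteKeyValue()

store.hmSet("key-1", {
    "field-1": "value-1",
    "field-2": "value-2"
})

// Read selected fields; a missing field is returned as undefined.
store.hmGet("key-1", ["field-1", "field-100"])
// --> {"field-1": "value-1", "field-100": undefined}

// Without the optional `fields` parameter, all fields are returned.
store.hmGet("key-1")
// --> {"field-1": "value-1", "field-2": "value-2"}
```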

@@ -1099,8 +1117,11 @@ hHasField(key: string, field: string)
```

Returns whether `field` is an existing field in the hash stored at `key`.
Do not use it with several large amounts of data or blobs.
This is because the entire data record with all fields is always read.

Do not use the hash functions with very large amounts of data or blobs.
This is because the entire data record with all fields is always read and written.
It is better to use `setValues()` and `getValues()` for large amounts of data.

Inspired by: https://docs.keydb.dev/docs/commands/#hexists

### key
@@ -1125,6 +1146,105 @@ store.hHasField("key-1", "field-1") // --> undefined
```


## Hash (Map Object) - Count Fields

```typescript
hGetCount(key: string)
```

Returns the number of fields contained in the hash stored at `key`.

Do not use the hash functions with very large amounts of data or blobs.
This is because the entire data record with all fields is always read and written.
It is better to use `setValues()` and `getValues()` for large amounts of data.

Inspired by: https://docs.keydb.dev/docs/commands/#hlen

### key

The key must be a string.

### Example

```typescript
import { BunSqliteKeyValue } from "bun-sqlite-key-value"
const store = new BunSqliteKeyValue()
store.hGetCount("key-1") // --> undefined
store.hSet("key-1", "field-1", "value-1")
store.hGetCount("key-1") // --> 1
```


## Hash (Map Object) - Get All Field Names

```typescript
hGetFields(key: string)
```

Returns the field names contained in the hash stored at `key`.
Use `hmGet()` to read field names and values.

Do not use the hash functions with very large amounts of data or blobs.
This is because the entire data record with all fields is always read and written.
It is better to use `setValues()` and `getValues()` for large amounts of data.

Inspired by: https://docs.keydb.dev/docs/commands/#hkeys

### key

The key must be a string.

### Example

```typescript
import { BunSqliteKeyValue } from "bun-sqlite-key-value"
const store = new BunSqliteKeyValue()
store.hmSet("key-1", {
"field-1": "value-1",
"field-2": "value-2"
})
store.hGetFields("key-1") // --> ["field-1", "field-2"]
```


## Hash (Map Object) - Get All Values

```typescript
hGetValues(key: string)
```

Returns the values contained in the hash stored at `key`.
Use `hmGet()` to read field names and values.

Do not use the hash functions with very large amounts of data or blobs.
This is because the entire data record with all fields is always read and written.
It is better to use `setValues()` and `getValues()` for large amounts of data.

Inspired by: https://docs.keydb.dev/docs/commands/#hvals

### key

The key must be a string.

### Example

```typescript
import { BunSqliteKeyValue } from "bun-sqlite-key-value"
const store = new BunSqliteKeyValue()
store.hmSet("key-1", {
"field-1": "value-1",
"field-2": "value-2"
})
store.hGetValues("key-1") // --> ["value-1", "value-2"]
```


## Multiple Databases

It is no problem at all to use several databases and access them at the same time.
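
As a minimal sketch, two independent in-memory stores can be used side by side (passing a file path to the constructor to use separate database files is assumed to work the same way, but is not shown in this diff):

```typescript
import { BunSqliteKeyValue } from "bun-sqlite-key-value"

// Two independent stores, each backed by its own database.
const store1 = new BunSqliteKeyValue()
const store2 = new BunSqliteKeyValue()

store1.set("key-1", "value from store 1")
store2.set("key-1", "value from store 2")

store1.get("key-1") // --> "value from store 1"
store2.get("key-1") // --> "value from store 2"
```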
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "bun-sqlite-key-value",
"version": "1.10.7",
"version": "1.10.8",
"author": {
"name": "Gerold Penz",
"email": "[email protected]",
54 changes: 39 additions & 15 deletions src/index.ts
@@ -639,7 +639,7 @@ export class BunSqliteKeyValue {
}


// Do not use it with several large amounts of data or blobs.
// Do not use it with very large amounts of data or blobs.
// This is because the entire data record with all fields is always read and written.
// Inspired by: https://docs.keydb.dev/docs/commands/#hset
hSet<T = any>(key: string, field: string, value: T, ttlMs?: number): boolean {
@@ -654,7 +654,7 @@ export class BunSqliteKeyValue {
}


// Do not use it with several large amounts of data or blobs.
// Do not use it with very large amounts of data or blobs.
// This is because the entire data record with all fields is always read and written.
// Inspired by: https://docs.keydb.dev/docs/commands/#hget
hGet<T = any>(key: string, field: string): T | undefined {
@@ -664,7 +664,7 @@ export class BunSqliteKeyValue {
}


// Do not use it with several large amounts of data or blobs.
// Do not use it with very large amounts of data or blobs.
// This is because the entire data record with all fields is always read and written.
// Inspired by: https://docs.keydb.dev/docs/commands/#hmset
hmSet<T = any>(key: string, fields: {[field: string]: T}, ttlMs?: number) {
@@ -678,22 +678,26 @@ export class BunSqliteKeyValue {
}


// Do not use it with several large amounts of data or blobs.
// Do not use it with very large amounts of data or blobs.
// This is because the entire data record with all fields is always read and written.
// Inspired by: https://docs.keydb.dev/docs/commands/#hmget
hmGet<T = any>(key: string, fields: string[]): {[field: string]: T | undefined} | undefined {
hmGet<T = any>(key: string, fields?: string[]): {[field: string]: T | undefined} | undefined {
const map = this.get<Map<string, T>>(key)
if (map === undefined) return
const result: {[field: string]: T | undefined} = {}
fields.forEach((field) => {
result[field] = map.get(field)
})
if (fields) {
fields.forEach((field) => {
result[field] = map.get(field)
})
} else {
Object.assign(result, Object.fromEntries(map.entries()))
}
return result
}


// Returns whether `field` is an existing field in the hash stored at `key`.
// Do not use it with several large amounts of data or blobs.
// Do not use it with very large amounts of data or blobs.
// This is because the entire data record with all fields is always read.
// Inspired by: https://docs.keydb.dev/docs/commands/#hexists
hHasField(key: string, field: string): boolean | undefined {
@@ -707,22 +711,42 @@ export class BunSqliteKeyValue {
hExists = this.hHasField


// ToDo: hLen()
// Inspired by: https://docs.keydb.dev/docs/commands/#hlen
hGetCount(key: string): number | undefined {
const map = this.get<Map<string, any>>(key)
if (map === undefined) return
return map.size
}


// Alias for hGetCount()
hLen = this.hGetCount


// ToDo: hKeys()
// Inspired by: https://docs.keydb.dev/docs/commands/#hkeys
hGetFields(key: string): string[] | undefined {
const map = this.get<Map<string, any>>(key)
if (map === undefined) return
return [...map.keys()]
}


// ToDo: hStrLen()
// Inspired by: https://docs.keydb.dev/docs/commands/#hstrlen
// Alias for hGetFields()
hKeys = this.hGetFields


// ToDo: hVals()
// Do not use it with several large amounts of data or blobs.
// Do not use it with very large amounts of data or blobs.
// This is because the entire data record with all fields is always read and written.
// Inspired by: https://docs.keydb.dev/docs/commands/#hvals
hGetValues<T = any>(key: string): T[] | undefined {
const map = this.get<Map<string, T>>(key)
if (map === undefined) return
return [...map.values()]
}


// Alias for hGetValues()
hVals = this.hGetValues


// ToDo: hDel()
43 changes: 41 additions & 2 deletions tests/memory.test.ts
@@ -648,19 +648,58 @@ test("hmSet(), hmGet()", async () => {
})

// Get multiple fields
expect(store.hmGet(KEY_2, ["field-1", "field-2"]))
expect(store.hmGet(KEY_2, ["field-1"])).toBeUndefined()
const result = store.hmGet(KEY_1, ["field-1", "field-100"])
expect(result?.["field-1"]).toEqual("value-1")
expect(result?.["field-100"]).toBeUndefined()

// Get all fields
expect(Object.keys(store.hmGet(KEY_1)!).length).toEqual(2)
})


test("hHasField()", async () => {
const store = new BunSqliteKeyValue()

// Set multiple fields
store.hSet(KEY_1, "field-1", "value-1")

expect(store.hHasField(KEY_1, "field-1")).toBeTrue()
expect(store.hExists(KEY_2, "field-1")).toBeUndefined()
})


test("hGetCount()", async () => {
const store = new BunSqliteKeyValue()

expect(store.hGetCount(KEY_1)).toBeUndefined()

store.set(KEY_1, "value-1")
expect(store.hGetCount(KEY_1)).toBeUndefined()

store.hSet(KEY_2, "field-1", "value-1")
expect(store.hLen(KEY_2)).toEqual(1)
})


test("hGetFields()", async () => {
const store = new BunSqliteKeyValue()

store.hmSet(KEY_1, {
"field-1": "value-1",
"field-2": "value-2"
})
expect(store.hGetFields(KEY_1)).toEqual(["field-1", "field-2"])
expect(store.hKeys(KEY_1)).toBeArrayOfSize(2)
})


test("hGetValues()", async () => {
const store = new BunSqliteKeyValue()

store.hmSet(KEY_1, {
"field-1": "value-1",
"field-2": "value-2"
})
expect(store.hGetValues(KEY_1)).toEqual(["value-1", "value-2"])
expect(store.hVals(KEY_1)).toBeArrayOfSize(2)
})
