CachePlain
cache = CachePlain[Info, Data](cache_size: int)
The plain implementation of the cache, where Info and Data are generic types. In this project, Info and Data are configured as CachedFileInfo and CachedData, respectively.
This CachePlain instance can only share the cached data among threads. It can be used in normal callbacks. However, in background callbacks, any modifications on CachePlain will be lost when the callback is finalized.
Manipulation of this cache is thread-safe.
Aliases
This class can be acquired by any of the following aliases:
import dash_file_cache as dfc
dfc.CachePlain
dfc.caches.CachePlain
dfc.caches.memory.CachePlain
Arguments
Argument | Type | Required | Description
---|---|---|---
cache_size | int | Yes | The size of the cache, i.e. the maximum number of items in the cache. When the cache is full, adding more data will cause the eldest item to be evicted from the cache.
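For illustration, the following sketch constructs a small cache and lets it overflow. Plain dicts and strings stand in for the generic Info and Data types here (the project itself configures them as CachedFileInfo and CachedData); the key names are arbitrary.

```python
import dash_file_cache as dfc

# A cache holding at most two items.
cache = dfc.CachePlain(cache_size=2)

cache.dump("a", {"name": "a"}, "data-a")
cache.dump("b", {"name": "b"}, "data-b")
cache.dump("c", {"name": "c"}, "data-c")  # The eldest item "a" is evicted.

print("a" in cache, "c" in cache)  # False True
```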
Methods
is_in
flag: bool = cache.is_in(key: str)
Check whether key is registered in the cache.
The __contains__ operator is delegated to this method.
Requires
Argument | Type | Required | Description
---|---|---|---
key | str | Yes | The keyword to be validated.
Returns
Argument | Type | Description
---|---|---
flag | bool | If True, the given key value exists in the cache.
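A minimal sketch of the membership check (again with a plain dict and a string standing in for Info and Data):

```python
import dash_file_cache as dfc

cache = dfc.CachePlain(cache_size=2)
cache.dump("report", {"mime": "text/plain"}, "hello")

print(cache.is_in("report"))   # True
print(cache.is_in("missing"))  # False
print("report" in cache)       # True: the in operator delegates to is_in.
```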
remove
info: Info = cache.remove(key: str)
Remove one cached item from this cache, returning its info.
Using this method implies that the cached data has reached its end of life; only the lightweight metadata of the item remains usable.
Requires
Argument | Type | Required | Description
---|---|---|---
key | str | Yes | The keyword of the data to be removed.
Returns
Argument | Type | Description
---|---|---
info | Info | The lightweight metadata queried in the cache.
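A sketch of removing an item while keeping the metadata it returns (plain dict/str stand-ins for Info and Data):

```python
import dash_file_cache as dfc

cache = dfc.CachePlain(cache_size=2)
cache.dump("report", {"mime": "text/plain"}, "hello")

info = cache.remove("report")  # The item leaves the cache.
print(info)                    # The metadata is still returned.
print("report" in cache)       # False
```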
dump
cache.dump(key: str, info: Info, data: Data)
Dump data to the cache.
Requires
Argument | Type | Required | Description
---|---|---|---
key | str | Yes | The key value of this new data. If the key already exists in the cache, the original value will be replaced.
info | Info | Yes | The lightweight metadata to be dumped into the cache.
data | Data | Yes | The data to be dumped into the cache.
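A sketch of dumping data, including the replacement behavior when the same key is dumped twice (a plain dict and io.BytesIO objects stand in for Info and Data):

```python
import io

import dash_file_cache as dfc

cache = dfc.CachePlain(cache_size=2)

# Dumping to the same key twice replaces the previously cached value.
cache.dump("report", {"mime": "text/plain"}, io.BytesIO(b"version 1"))
cache.dump("report", {"mime": "text/plain"}, io.BytesIO(b"version 2"))

print(cache.load_data("report").read())  # b'version 2'
```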
load
info, data_loader = cache.load(key: str)
Load the data by a specific keyword.
Requires
Argument | Type | Required | Description
---|---|---|---
key | str | Yes | The key value to be queried. If key does not exist in the cache, a FileNotFoundError will be raised.
Returns
Argument | Type | Description
---|---|---
info | Info | The lightweight metadata queried in the cache.
data_loader | Callable[[], Data] | The lazy loader for the data. This function implements a deferred loading mechanism, allowing the large-size data to be actually loaded only when the loader is called.
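A sketch of the deferred loading pattern (a plain dict and an io.BytesIO object stand in for Info and Data):

```python
import io

import dash_file_cache as dfc

cache = dfc.CachePlain(cache_size=2)
cache.dump("report", {"mime": "text/plain"}, io.BytesIO(b"hello"))

info, data_loader = cache.load("report")
print(info)           # The metadata is available immediately.
data = data_loader()  # The data is fetched lazily, only when the loader is called.
print(data.read())    # b'hello'
```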
load_info
info: Info = cache.load_info(key: str)
Load the metadata by a specific keyword.
This method is implemented by returning load(key)[0].
Requires
Argument | Type | Required | Description
---|---|---|---
key | str | Yes | The key value to be queried. If key does not exist in the cache, a FileNotFoundError will be raised.
Returns
Argument | Type | Description
---|---|---
info | Info | The lightweight metadata queried in the cache.
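A minimal sketch (plain dict/str stand-ins for Info and Data):

```python
import dash_file_cache as dfc

cache = dfc.CachePlain(cache_size=2)
cache.dump("report", {"mime": "text/plain"}, "hello")

print(cache.load_info("report"))  # {'mime': 'text/plain'}
# cache.load_info("missing")      # Would raise FileNotFoundError.
```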
load_data
data: Data = cache.load_data(key: str)
Load the data by a specific keyword.
This method is implemented by fetching and calling load(key)[1]().
Requires
Argument | Type | Required | Description
---|---|---|---
key | str | Yes | The key value to be queried. If key does not exist in the cache, a FileNotFoundError will be raised.
Returns
Argument | Type | Description
---|---|---
data | Data | The data object queried in the cache. Since the data may be large, this value should be a file-like object in most cases.
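A minimal sketch using an io.BytesIO object as the cached Data (a plain dict stands in for Info):

```python
import io

import dash_file_cache as dfc

cache = dfc.CachePlain(cache_size=2)
cache.dump("report", {"mime": "text/plain"}, io.BytesIO(b"hello"))

print(cache.load_data("report").read())  # b'hello'
```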
Properties
cache
cache_dict: LRUDict[str, Tuple[Info, Data]] = cache.cache
Get the low-level LRU cache object of this instance. This value is an LRUDict.
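A sketch of inspecting the low-level cache, assuming the LRUDict behaves like a standard mapping (plain dict/str stand-ins for Info and Data):

```python
import dash_file_cache as dfc

cache = dfc.CachePlain(cache_size=2)
cache.dump("report", {"mime": "text/plain"}, "hello")

lru = cache.cache        # The underlying LRUDict instance.
print(len(lru))          # 1
print("report" in lru)   # True: a mapping of key -> (info, data).
```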
Operators
__contains__
flag: bool = key in cache
Check whether key is registered in the cache.
Requires
Argument | Type | Required | Description
---|---|---|---
key | str | Yes | The keyword to be validated.
Returns
Argument | Type | Description
---|---|---
flag | bool | If True, the given key value exists in the cache.
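A sketch of guarding a load with the in operator (plain dict/str stand-ins for Info and Data):

```python
import dash_file_cache as dfc

cache = dfc.CachePlain(cache_size=2)
cache.dump("report", {"mime": "text/plain"}, "hello")

key = "report"
if key in cache:  # Equivalent to cache.is_in(key).
    info, loader = cache.load(key)
    print(info, loader())
```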
Examples
See