CacheQueue
```python
cache = CacheQueue[Info, Data](
    cache_size: int,
    qobj: queue.Queue | Callable[[], queue.Queue] | None = None
)
```
The cache implementation based on a process-sharable `Queue()`.

Note that a threading queue can also be used here, but it is recommended to use `multiprocessing.get_context(...).Manager().Queue()`.
This `CacheQueue` instance is designed for sharing cached data among processes. The cache can be accessed by the background callbacks.

Any instance of this class will create a daemon thread if it is initialized in the main process. This daemon thread keeps listening to the queue.

Manipulation of this cache is process-safe.
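The daemon-thread pattern described above can be sketched with the standard library alone: a background thread drains a shared queue and applies each `(key, value)` message to a local mapping. The names `start_listener` and the `None` stop sentinel are illustrative for this sketch, not the library's actual internals (the real listener runs for the whole process lifetime).

```python
import queue
import threading


def start_listener(q: queue.Queue, store: dict) -> threading.Thread:
    """Drain (key, value) messages from ``q`` into ``store``."""

    def _listen() -> None:
        while True:
            msg = q.get()
            if msg is None:  # stop sentinel, used only in this sketch
                break
            key, value = msg
            store[key] = value

    thread = threading.Thread(target=_listen, daemon=True)
    thread.start()
    return thread


q: queue.Queue = queue.Queue()
store: dict = {}
listener = start_listener(q, store)
q.put(("image", b"\x89PNG"))  # a sub-process would put messages like this
q.put(None)  # stop the sketch listener so the example terminates
listener.join()
```

When the queue is a `Manager().Queue()`, sub-processes can put messages onto it while the main-process thread consumes them, which is what makes the cache process-safe.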
Aliases
This class can be acquired by

```python
import dash_file_cache as dfc

dfc.CacheQueue
dfc.caches.CacheQueue
dfc.caches.memory.CacheQueue
```
Arguments
| Argument | Type | Required | Description |
| --- | --- | --- | --- |
| `cache_size` | `int` | Yes | The size of the cache, i.e. the maximum number of items in the cache. When the cache is full, adding more data will cause the oldest item to be evicted from the cache. |
| `qobj` | `queue.Queue \| Callable[[], queue.Queue] \| None` | No | The queue object provided by either `queue.Queue()` or `multiprocessing.get_context(...).Manager().Queue()`. It is used for synchronizing the data from the sub-processes to the main process. If this value is `None`, the property `CacheQueue().qobj` will be implemented later; this property needs to be configured before the first time this cache is used. If this value is a function returning a queue, the function will be used for deferred loading, i.e. the queue will be initialized when it is actually used for the first time. |
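The three accepted forms of `qobj` can be sketched with a small resolver. The function `resolve_qobj` below is illustrative only, not the library's actual implementation; it shows how a direct queue, a factory callable, and `None` would each be handled.

```python
import queue
from typing import Callable, Optional, Union


def resolve_qobj(
    qobj: Union[queue.Queue, Callable[[], queue.Queue], None]
) -> Optional[queue.Queue]:
    """Illustrative resolver for the three accepted forms of ``qobj``."""
    if qobj is None:
        # Left unconfigured: must be set via the qobj property later.
        return None
    if callable(qobj):
        # Deferred loading: the queue is created only on first use.
        return qobj()
    return qobj


direct = resolve_qobj(queue.Queue())   # a ready-made queue
deferred = resolve_qobj(queue.Queue)   # a factory, called lazily
pending = resolve_qobj(None)           # to be configured later
```

The deferred form is useful when the queue must be created inside a specific multiprocessing context that is not available at construction time.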
Methods
is_in
```python
flag: bool = cache.is_in(key: str)
```

Check whether `key` is registered in the cache.

The `__contains__` operator is delegated to this method.

Please only use this method in the main process. It will not work in any subprocess.
Requires
| Argument | Type | Required | Description |
| --- | --- | --- | --- |
| `key` | `str` | Yes | The keyword to be validated. |
Returns
| Argument | Type | Description |
| --- | --- | --- |
| `flag` | `bool` | If `True`, the given `key` value exists in the cache. |
remove
```python
info: Info = cache.remove(key: str)
```

Remove the info of one cached item from this cache.

Using this method implies that the data in the cache has reached its end of life. Only the cached item's metadata will still be usable.
Requires
| Argument | Type | Required | Description |
| --- | --- | --- | --- |
| `key` | `str` | Yes | The keyword of the data to be removed. |
Returns
| Argument | Type | Description |
| --- | --- | --- |
| `info` | `Info` | The lightweight metadata queried from the cache. |
dump
```python
cache.dump(key: str, info: Info, data: Data)
```

Dump data to the cache.
Requires
| Argument | Type | Required | Description |
| --- | --- | --- | --- |
| `key` | `str` | Yes | The key value of this new data. If the key exists in the cache, the original value will be replaced. |
| `info` | `Info` | Yes | The lightweight metadata to be dumped into the cache. |
| `data` | `Data` | Yes | The data to be dumped into the cache. |
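The replace-on-existing-key and evict-when-full behaviors described above can be sketched with `collections.OrderedDict` as a stand-in for the internal `LRUDict`. The class `LRUStore` below is illustrative, not the library's implementation.

```python
from collections import OrderedDict
from typing import Tuple


class LRUStore:
    """A tiny LRU mapping mimicking the ``cache_size`` eviction rule."""

    def __init__(self, cache_size: int) -> None:
        self.cache_size = cache_size
        self.data: "OrderedDict[str, Tuple[dict, bytes]]" = OrderedDict()

    def dump(self, key: str, info: dict, data: bytes) -> None:
        if key in self.data:
            # An existing key is replaced and refreshed as most recent.
            self.data.move_to_end(key)
        self.data[key] = (info, data)
        if len(self.data) > self.cache_size:
            # The cache is full: evict the oldest item.
            self.data.popitem(last=False)


store = LRUStore(cache_size=2)
store.dump("a", {"mime": "text/plain"}, b"1")
store.dump("b", {"mime": "text/plain"}, b"2")
store.dump("c", {"mime": "text/plain"}, b"3")  # "a" is evicted
```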
load
```python
info, data_loader = cache.load(key: str)
```

Load the data by a specific keyword.

Please only use this method in the main process. It will not work in any subprocess.
Requires
| Argument | Type | Required | Description |
| --- | --- | --- | --- |
| `key` | `str` | Yes | The key value to be queried. If `key` does not exist in the cache, a `FileNotFoundError` will be raised. |
Returns
| Argument | Type | Description |
| --- | --- | --- |
| `info` | `Info` | The lightweight metadata queried from the cache. |
| `data_loader` | `Callable[[], Data]` | The lazy loader for the data. This function implements a deferred loading mechanism, allowing the large-size data to be actually loaded only when this loader is called. |
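The deferred-loading contract above can be sketched in plain Python: the metadata is returned immediately, while the payload is wrapped in a closure that materializes it only on call. The `_cache` dict and the `dict`/`bytes` stand-ins for the generic `Info`/`Data` parameters are illustrative assumptions, not the library's internals.

```python
import io
from typing import Callable, Dict, Tuple

# Illustrative stand-ins for the generic Info / Data type parameters.
_cache: Dict[str, Tuple[dict, bytes]] = {
    "report": ({"mime": "text/plain"}, b"hello"),
}


def load(key: str) -> Tuple[dict, Callable[[], io.BytesIO]]:
    """Return the metadata now and a deferred loader for the data."""
    if key not in _cache:
        raise FileNotFoundError(key)
    info, data = _cache[key]

    def data_loader() -> io.BytesIO:
        # The (possibly large) payload is materialized only when called.
        return io.BytesIO(data)

    return info, data_loader


info, loader = load("report")
payload = loader().read()  # the data is loaded only at this point
```

This shape lets callers inspect the metadata cheaply and skip reading the payload entirely when it is not needed.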
load_info
```python
info: Info = cache.load_info(key: str)
```

Load the metadata by a specific keyword.

This method is implemented by fetching the returned value of `load(key)[0]`.
Requires
| Argument | Type | Required | Description |
| --- | --- | --- | --- |
| `key` | `str` | Yes | The key value to be queried. If `key` does not exist in the cache, a `FileNotFoundError` will be raised. |
Returns
| Argument | Type | Description |
| --- | --- | --- |
| `info` | `Info` | The lightweight metadata queried from the cache. |
load_data
```python
data: Data = cache.load_data(key: str)
```

Load the data by a specific keyword.

This method is implemented by fetching and calling `load(key)[1]()`.
Requires
| Argument | Type | Required | Description |
| --- | --- | --- | --- |
| `key` | `str` | Yes | The key value to be queried. If `key` does not exist in the cache, a `FileNotFoundError` will be raised. |
Returns
| Argument | Type | Description |
| --- | --- | --- |
| `data` | `Data` | The data object queried from the cache. Since the data may be large, this value should be a file-like object in most cases. |
Properties
qobj
```python
qobj: queue.Queue = cache.qobj

new_qobj: queue.Queue | None
cache.qobj = new_qobj
```

The queue object used for sharing data among processes.

Setting this property to `None` will not take any effect.
mirror
```python
cache_mirror: CacheQueueMirror[Info, Data] = cache.mirror
```

The queue mirror. This value is a `CacheQueueMirror`.

This property should be used when `dump()` needs to be called in sub-processes.
cache
```python
cache_dict: LRUDict[str, Tuple[Info, Data]] = cache.cache
```

Get the low-level LRU cache object of this instance. This value is an `LRUDict`.

Note that this property is not accessible in a sub-process.
Operators
__contains__
```python
flag: bool = key in cache
```

Check whether `key` is registered in the cache.

Please only use this operator in the main process. It will not work in any subprocess.
Requires
| Argument | Type | Required | Description |
| --- | --- | --- | --- |
| `key` | `str` | Yes | The keyword to be validated. |
Returns
| Argument | Type | Description |
| --- | --- | --- |
| `flag` | `bool` | If `True`, the given `key` value exists in the cache. |
Examples
See