dnikit_tensorflow#
TensorFlow extensions of DNIKit.
- class dnikit_tensorflow.TFDatasetExamples[source]#
Bases: object
Example TF datasets, each bundled as a DNIKit Producer. Loaded from tf.keras.datasets.
- class CIFAR10(split_dataset=None, attach_metadata=True, max_samples=-1)#
Bases: _KerasDatasetWithStrLabels
The CIFAR10 dataset, loaded from tf.keras.datasets, that produces Batches.
- Metadata labels for this dataset:
['airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck']
- Parameters:
split_dataset (Tuple[Tuple[ndarray, ndarray], Tuple[ndarray, ndarray]]) – [optional] It is unlikely this parameter will be overridden. This is the dataset as defined by (x_train, y_train), (x_test, y_test); by default, the CIFAR10 dataset is loaded from tf.keras.datasets.
attach_metadata (bool) – [optional] attach metadata to Batches produced by this Producer, under metadata key Batch.StdKeys.LABELS
max_samples (int) – [optional] number of samples this Producer should yield (helpful for testing pipelines with a small number of data samples)
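The split_dataset parameter follows the nested-tuple convention returned by tf.keras.datasets loaders. A minimal sketch of that structure, using synthetic numpy arrays in place of the real CIFAR10 data (sample counts here are made up; real CIFAR10 images are 32x32x3):

```python
import numpy as np

# Synthetic stand-in for the CIFAR10 arrays. The split_dataset parameter
# expects ((x_train, y_train), (x_test, y_test)), matching the return
# value of tf.keras.datasets.cifar10.load_data().
x_train = np.zeros((50, 32, 32, 3), dtype=np.uint8)
y_train = np.zeros((50, 1), dtype=np.uint8)
x_test = np.zeros((10, 32, 32, 3), dtype=np.uint8)
y_test = np.zeros((10, 1), dtype=np.uint8)

split_dataset = ((x_train, y_train), (x_test, y_test))
# A tuple shaped like this could be passed as CIFAR10(split_dataset=...),
# e.g. to feed a small custom dataset when testing a pipeline.
```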
- class CIFAR100(split_dataset=None, attach_metadata=True, max_samples=-1, label_mode='fine')#
Bases: _KerasDatasetWithStrLabels
The CIFAR100 dataset, loaded from tf.keras.datasets, that produces Batches.
- Metadata labels for this dataset (fine):
['apple', 'aquarium_fish', 'baby', 'bear', 'beaver', 'bed', 'bee', 'beetle', 'bicycle', 'bottle', 'bowl', 'boy', 'bridge', 'bus', 'butterfly', 'camel', 'can', 'castle', 'caterpillar', 'cattle', 'chair', 'chimpanzee', 'clock', 'cloud', 'cockroach', 'couch', 'crab', 'crocodile', 'cup', 'dinosaur', 'dolphin', 'elephant', 'flatfish', 'forest', 'fox', 'girl', 'hamster', 'house', 'kangaroo', 'keyboard', 'lamp', 'lawn_mower', 'leopard', 'lion', 'lizard', 'lobster', 'man', 'maple_tree', 'motorcycle', 'mountain', 'mouse', 'mushroom', 'oak_tree', 'orange', 'orchid', 'otter', 'palm_tree', 'pear', 'pickup_truck', 'pine_tree', 'plain', 'plate', 'poppy', 'porcupine', 'possum', 'rabbit', 'raccoon', 'ray', 'road', 'rocket', 'rose', 'sea', 'seal', 'shark', 'shrew', 'skunk', 'skyscraper', 'snail', 'snake', 'spider', 'squirrel', 'streetcar', 'sunflower', 'sweet_pepper', 'table', 'tank', 'telephone', 'television', 'tiger', 'tractor', 'train', 'trout', 'tulip', 'turtle', 'wardrobe', 'whale', 'willow_tree', 'wolf', 'woman', 'worm']
- Metadata labels for this dataset (coarse):
['aquatic_mammals', 'fish', 'flowers', 'food_containers', 'fruit_and_vegetables', 'household_electrical_devices', 'household_furniture', 'insects', 'large_carnivores', 'large_man-made_outdoor_things', 'large_natural_outdoor_scenes', 'large_omnivores_and_herbivores', 'medium_mammals', 'non-insect_invertebrates', 'people', 'reptiles', 'small_mammals', 'trees', 'vehicles_1', 'vehicles_2']
- Parameters:
split_dataset (Tuple[Tuple[ndarray, ndarray], Tuple[ndarray, ndarray]]) – [optional] It is unlikely this parameter will be overridden. This is the dataset as defined by (x_train, y_train), (x_test, y_test); by default, the CIFAR100 dataset is loaded from tf.keras.datasets.
attach_metadata (bool) – [optional] attach metadata to Batches produced by this Producer, under metadata key Batch.StdKeys.LABELS
max_samples (int) – [optional] number of samples this Producer should yield (helpful for testing pipelines with a small number of data samples)
label_mode (str) – [optional] either fine or coarse, determining the granularity of metadata labels (see the tf.keras.datasets documentation)
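The label_mode argument selects which of the two label sets above is attached as metadata: the 100 fine class names or the 20 coarse superclass names. A small illustration (plain Python, not DNIKit code) of how an integer class index from the raw dataset maps to a coarse label name:

```python
# The coarse label set listed above (20 superclasses); label_mode="coarse"
# attaches these names as metadata instead of the 100 fine labels.
coarse_labels = [
    'aquatic_mammals', 'fish', 'flowers', 'food_containers',
    'fruit_and_vegetables', 'household_electrical_devices',
    'household_furniture', 'insects', 'large_carnivores',
    'large_man-made_outdoor_things', 'large_natural_outdoor_scenes',
    'large_omnivores_and_herbivores', 'medium_mammals',
    'non-insect_invertebrates', 'people', 'reptiles', 'small_mammals',
    'trees', 'vehicles_1', 'vehicles_2',
]

# An integer class index from the raw dataset maps to its string label:
assert coarse_labels[1] == 'fish'
assert len(coarse_labels) == 20
```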
- class FashionMNIST(split_dataset=None, attach_metadata=True, max_samples=-1)#
Bases: _KerasDatasetWithStrLabels
The FashionMNIST dataset, loaded from tf.keras.datasets, that produces Batches.
- Metadata labels for this dataset:
['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat', 'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']
- Parameters:
split_dataset (Tuple[Tuple[ndarray, ndarray], Tuple[ndarray, ndarray]]) – [optional] It is unlikely this parameter will be overridden. This is the dataset as defined by (x_train, y_train), (x_test, y_test); by default, the FashionMNIST dataset is loaded from tf.keras.datasets.
attach_metadata (bool) – [optional] attach metadata to Batches produced by this Producer, under metadata key Batch.StdKeys.LABELS
max_samples (int) – [optional] number of samples this Producer should yield (helpful for testing pipelines with a small number of data samples)
- class MNIST(split_dataset=None, attach_metadata=True, max_samples=-1)#
Bases: _KerasDatasetLoader
The MNIST dataset, loaded from tf.keras.datasets, that produces Batches.
- Metadata labels for this dataset:
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
- Parameters:
split_dataset (Tuple[Tuple[ndarray, ndarray], Tuple[ndarray, ndarray]]) – [optional] It is unlikely this parameter will be overridden. This is the dataset as defined by (x_train, y_train), (x_test, y_test); by default, the MNIST dataset is loaded from tf.keras.datasets.
attach_metadata (bool) – [optional] attach metadata to Batches produced by this Producer, under metadata key Batch.StdKeys.LABELS
max_samples (int) – [optional] number of samples this Producer should yield (helpful for testing pipelines with a small number of data samples)
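All of the dataset classes above share the max_samples parameter: with the default of -1 every sample is yielded, while a non-negative value caps the yield at that many samples. A plain-Python sketch of that truncation behavior (illustrative only, not DNIKit source):

```python
import itertools

# Sketch of the max_samples contract: yield everything by default
# (max_samples=-1), or only the first max_samples samples otherwise.
def produce(samples, max_samples=-1):
    it = iter(samples)
    if max_samples >= 0:
        it = itertools.islice(it, max_samples)
    yield from it

assert list(produce(range(10), max_samples=3)) == [0, 1, 2]
assert list(produce(range(4))) == [0, 1, 2, 3]
```

This is why a small max_samples is handy for smoke-testing a pipeline before running it on a full dataset.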
- class dnikit_tensorflow.TFModelExamples[source]#
Bases: object
Out-of-the-box TF and Keras models with pre- and post-processing.
- MobileNet()#
Load the MobileNet model and processing stages from Keras into DNIKit.
- class dnikit_tensorflow.TFModelWrapper(model, preprocessing=None, postprocessing=None)[source]#
Bases: object
A wrapper for loading TensorFlow models into DNIKit Models and PipelineStages, with their pre- and post-processing functions built in.
- Parameters:
preprocessing (None | PipelineStage | Collection[PipelineStage]) – see preprocessing
postprocessing (None | PipelineStage | Collection[PipelineStage]) – see postprocessing
- __call__(requested_responses=None)[source]#
Generate a PipelineStage that preprocesses Batches for the Model, runs the model with the requested responses, and postprocesses responses before returning them.
Note
If the instance’s postprocessing or preprocessing properties are None, those steps are skipped.
- Parameters:
requested_responses (None | str | Collection[str]) – passed to the DNIKit Model. Determines which outputs from the model will be present in the Batch output by the resulting PipelineStage.
- Returns:
a single PipelineStage or a list of PipelineStages
- Return type:
PipelineStage | Collection[PipelineStage] | Sequence[PipelineStage | Collection[PipelineStage]]
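The stage ordering that __call__ assembles can be pictured with a plain-Python schematic (illustrative only; the names and composition mechanism here are not the DNIKit API): preprocessing, then the model, then postprocessing, with None stages skipped as the note above describes.

```python
# Schematic pipeline builder: apply each non-None stage to a batch in order.
def build_pipeline(preprocessing, run_model, postprocessing):
    stages = [s for s in (preprocessing, run_model, postprocessing)
              if s is not None]

    def pipeline(batch):
        for stage in stages:
            batch = stage(batch)
        return batch

    return pipeline

# e.g. scale inputs, apply a toy "model", no postprocessing:
pipeline = build_pipeline(lambda b: b / 255.0, lambda b: b > 0.5, None)
```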
- classmethod from_keras(model, preprocessing)[source]#
Convenience method for loading a TFModelWrapper from a Keras model and preprocessor.
Note
When subclassing TFModelWrapper, if there are additional pre- or post-processing steps to run outside of Keras’s preprocessing, modify the respective attribute of the returned object to add those steps as PipelineStages.
- Parameters:
- Return type:
TFModelWrapper
- static load_keras_model(model)[source]#
Saves a TF Keras model to disk and reloads it as a DNIKit Model.
- Parameters:
model (Model) – TF Keras model
- Return type:
Model
- postprocessing: None | PipelineStage | Collection[PipelineStage] = None#
One or many DNIKit PipelineStages for post-processing batches after model output
- preprocessing: None | PipelineStage | Collection[PipelineStage] = None#
One or many DNIKit PipelineStages for pre-processing batches for this model
- property response_infos: Mapping[str, ResponseInfo]#
Get all possible responses in a model, returned as a mapping between response names and the corresponding ResponseInfo.
- dnikit_tensorflow.load_tf_model_from_memory(*, session=None, model=None)[source]#
Initialize a TensorFlow Model from a model loaded in memory. This function is supported for both TF2 and TF1, but different parameters are required: for TF2, pass only the model parameter; for TF1, pass only the session parameter.
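The keyword-only contract described above (exactly one of session or model) can be sketched in plain Python. This is an illustration of the documented rule, not the DNIKit implementation:

```python
# Illustrative argument check mirroring load_tf_model_from_memory's
# contract: exactly one of `session` (TF1) or `model` (TF2) may be given.
def resolve_mode(*, session=None, model=None):
    if (session is None) == (model is None):
        raise ValueError(
            "pass exactly one of `session` (TF1) or `model` (TF2)")
    return "tf1" if session is not None else "tf2"

assert resolve_mode(model=object()) == "tf2"
assert resolve_mode(session=object()) == "tf1"
```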
- dnikit_tensorflow.load_tf_model_from_path(path)[source]#
Initialize a TensorFlow Model from a model serialized at path. The accepted serialized model formats depend on whether TF1 or TF2 is running.
- TF2 Supported formats:
TensorFlow Keras SavedModel
Keras whole models (h5)
Keras models with separate architecture and weights files
- TF1 Supported formats:
TensorFlow SavedModel
TensorFlow checkpoint (pass the checkpoint prefix as the path param)
TensorFlow protobuf
Keras whole models
Keras models with separate architecture and weights files
Note
The Keras loaders currently use tf.keras instead of native keras, so issues might appear when trying to load models saved with native keras (not tf.keras). In this case, load the model outside of DNIKit with keras and pass it to load_tf_model_from_memory.