BulkInferrer

E457350

BulkInferrer is a TensorFlow Extended (TFX) component that runs large-scale batch inference with a trained machine learning model over sizable datasets of examples.


Statements (44)

Predicate Object
instanceOf TFX component
batch inference component
abbreviationOf TFX BulkInferrer
canUse TFX Transform outputs
TransformGraph artifact
category MLOps
machine learning infrastructure
compatibleWith Apache Airflow
Kubeflow Pipelines
TFX pipelines
Vertex AI Pipelines
configuredBy BulkInferrerSpec
designedFor production ML workflows
scalable inference
developedBy Google
documentationUrl https://www.tensorflow.org/tfx/guide/bulkinferrer
follows Trainer component
hasProperty deterministic batch prediction
non-serving, offline inference
implementedIn Python
inputType TFRecord files via Examples artifact
integratesWith TFX metadata
TFX orchestration
license Apache License 2.0
operatesOn Examples artifact
TFX Example artifacts
outputType TFRecord files with predictions
partOf TensorFlow Extended
precedes model evaluation in some pipelines
produces PredictionResults artifact
inference results
predictions
requires Model artifact
trained model
softwareLibrary TensorFlow Extended
supports SavedModel format
TensorFlow models
data parallelism
distributed processing
usedFor generating predictions on sizable datasets
large-scale batch inference
offline prediction generation
running inference with trained machine learning models
scoring examples with a trained model
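The statements above can be illustrated with a minimal pipeline-wiring sketch. It assumes the `tfx` package is installed and that `example_gen`, `trainer`, and `evaluator` are hypothetical upstream components already defined in the same pipeline:

```python
# Sketch of wiring BulkInferrer into a TFX pipeline (assumes `tfx` is
# installed; example_gen, trainer, and evaluator are hypothetical upstream
# components defined elsewhere in the pipeline).
from tfx.components import BulkInferrer
from tfx.proto import bulk_inferrer_pb2

bulk_inferrer = BulkInferrer(
    examples=example_gen.outputs['examples'],      # Examples artifact (TFRecord files)
    model=trainer.outputs['model'],                # trained Model artifact
    model_blessing=evaluator.outputs['blessing'],  # optional: infer only with a blessed model
    data_spec=bulk_inferrer_pb2.DataSpec(),        # which example splits to score
    model_spec=bulk_inferrer_pb2.ModelSpec(),      # SavedModel signature/tag configuration
)

# Downstream components can consume bulk_inferrer.outputs['inference_result'],
# the PredictionResults artifact holding TFRecord files with predictions.
```

This mirrors the table: Examples and Model artifacts in, a PredictionResults artifact out, with the component configured by spec protos rather than custom code.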

Referenced by (1)

Full triples — surface form annotated when it differs from this entity's canonical label.

TensorFlow Extended hasComponent BulkInferrer