

# How to create, map, and read tf.data datasets
The tf.data API enables you to build complex input pipelines from simple, reusable pieces. For example, the pipeline for an image model might aggregate data from files in a distributed file system, apply random perturbations to each image, and merge randomly selected images into a batch for training. The pipeline for a text model might involve extracting symbols from raw text data, converting them to embedding identifiers with a lookup table, and batching sequences together. The tf.data API makes it possible to handle large amounts of data, read from different data formats, and perform complex transformations.

The tf.data API introduces a tf.data.Dataset abstraction that represents a sequence of elements, in which each element consists of one or more components. For example, in an image pipeline, an element might be a single training example, with a pair of tensor components representing the image and its label.

There are two distinct ways to create a dataset:

- A data source constructs a Dataset from data stored in memory or in files.
- A data transformation constructs a dataset from one or more existing Dataset objects.

To create an input pipeline, you must start with a data source. To construct a Dataset from data in memory, you can use tf.data.Dataset.from_tensors() or tf.data.Dataset.from_tensor_slices(). Alternatively, if your input data is stored in a file in the recommended TFRecord format, you can use tf.data.TFRecordDataset().

Once you have a Dataset object, you can transform it into a new Dataset by chaining method calls on the tf.data.Dataset object. For example, you can apply per-element transformations such as Dataset.map, and multi-element transformations such as Dataset.batch. Refer to the documentation for tf.data.Dataset for a complete list of transformations.

The Dataset object is a Python iterable, which makes it possible to consume its elements using a for loop. Alternatively, you can create a Python iterator explicitly with iter and consume elements with next, or use the reduce transformation, which combines all elements to produce a single result, such as the sum of a dataset of integers.
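The creation, transformation, and consumption steps above can be sketched end to end. This is a minimal example, assuming TensorFlow 2.x is installed; the sample values `[8, 3, 0, 8, 2, 1]` and the doubling map are illustrative choices, not taken from the original text:

```python
import tensorflow as tf

# A data source: build a Dataset from an in-memory list of integers.
dataset = tf.data.Dataset.from_tensor_slices([8, 3, 0, 8, 2, 1])

# Chain transformations: a per-element map, then a multi-element batch.
transformed = dataset.map(lambda x: x * 2).batch(2)

# Consume with a for loop (a Dataset is a Python iterable).
for batch in transformed:
    print(batch.numpy())

# Or consume by explicitly creating a Python iterator with iter/next.
it = iter(dataset)
print(next(it).numpy())  # -> 8

# Or reduce all elements to a single result, here their sum.
total = dataset.reduce(tf.constant(0), lambda state, value: state + value)
print(total.numpy())  # -> 22
```

Note that map and batch return new Dataset objects rather than modifying the original, which is why the calls can be chained.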
