Attention
June 2024 Status Update: Removing DataPipes and DataLoader V2
We are re-focusing the torchdata repo to be an iterative enhancement of torch.utils.data.DataLoader. We do not plan on continuing development or maintaining the [DataPipes] and [DataLoaderV2] solutions, and they will be removed from the torchdata repo. We will also be revisiting the DataPipes references in pytorch/pytorch. In release torchdata==0.8.0 (July 2024) they will be marked as deprecated, and in 0.9.0 (Oct 2024) they will be deleted. Existing users are advised to pin to torchdata==0.8.0 or an older version until they are able to migrate away. Subsequent releases will not include DataPipes or DataLoaderV2. Please reach out if you have suggestions or comments (please use this issue for feedback).
AISFileLoader¶
- class torchdata.datapipes.iter.AISFileLoader(source_datapipe: IterDataPipe[str], url: str, length: int = -1)¶
Iterable DataPipe that loads files from AIStore with the given URLs (functional name: load_files_by_ais). Iterates all files in BytesIO format and returns a tuple (url, BytesIO).

Note:
- This function also supports files from multiple backends (aws://.., gcp://.., azure://.., etc.)
- Input must be a list; direct URLs are not supported.
- This internally uses the AIStore Python SDK.
- Parameters:
source_datapipe (IterDataPipe[str]) – a DataPipe that contains URLs/URL prefixes to objects
url (str) – AIStore endpoint
length (int) – length of the datapipe
Example
>>> from torchdata.datapipes.iter import IterableWrapper, AISFileLister, AISFileLoader
>>> ais_prefixes = IterableWrapper(['gcp://bucket-name/folder/', 'aws://bucket-name/folder/', 'ais://bucket-name/folder/', ...])
>>> dp_ais_urls = AISFileLister(url='localhost:8080', source_datapipe=ais_prefixes)
>>> dp_cloud_files = AISFileLoader(url='localhost:8080', source_datapipe=dp_ais_urls)
>>> for url, file in dp_cloud_files:
...     pass
>>> # Functional API
>>> dp_cloud_files = dp_ais_urls.load_files_by_ais(url='localhost:8080')
>>> for url, file in dp_cloud_files:
...     pass