class torchdata.datapipes.iter.AISFileLoader(source_datapipe: IterDataPipe[str], url: str, length: int = -1)

Iterable DataPipe that loads files from AIStore with the given URLs (functional name: load_files_by_ais). Iterates over all files and yields a tuple (url, BytesIO) for each, where the file contents are wrapped in a BytesIO stream.

Note:

  • This DataPipe also supports files from multiple backends (aws://.., gcp://.., azure://.., etc.)

  • Input must be a list; direct URLs are not supported.

  • This internally uses the AIStore Python SDK.

Parameters:

  • source_datapipe (IterDataPipe[str]) – a DataPipe that contains URLs/URL prefixes to objects

  • url (str) – AIStore endpoint

  • length (int) – length of the datapipe


>>> from torchdata.datapipes.iter import IterableWrapper, AISFileLister, AISFileLoader
>>> ais_prefixes = IterableWrapper(['gcp://bucket-name/folder/', 'aws://bucket-name/folder/', 'ais://bucket-name/folder/', ...])
>>> dp_ais_urls = AISFileLister(url='localhost:8080', source_datapipe=ais_prefixes)
>>> dp_cloud_files = AISFileLoader(url='localhost:8080', source_datapipe=dp_ais_urls)
>>> for url, file in dp_cloud_files:
...     pass
>>> # Functional API
>>> dp_cloud_files = dp_ais_urls.load_files_by_ais(url='localhost:8080')
>>> for url, file in dp_cloud_files:
...     pass
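
Since each item yielded by AISFileLoader is a (url, BytesIO) pair, downstream code typically reads or decodes the stream. The sketch below illustrates that consumption pattern with in-memory sample data standing in for a live AIStore cluster; the URLs and file contents here are made up for illustration.

```python
from io import BytesIO

# Sample (url, BytesIO) pairs in the same shape that AISFileLoader yields;
# real contents would come from objects in the AIStore cluster.
samples = [
    ("ais://bucket-name/folder/a.txt", BytesIO(b"hello")),
    ("ais://bucket-name/folder/b.txt", BytesIO(b"world")),
]

# Read the raw bytes out of each stream, keyed by object URL.
contents = {url: stream.read() for url, stream in samples}
print(contents["ais://bucket-name/folder/a.txt"])  # b'hello'
```

The same loop body works unchanged when `samples` is replaced by a real dp_cloud_files pipeline, since both yield (url, BytesIO) tuples.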

