public class DirectoryCrawler extends Object implements DataProvider
This class handles data files recursively, starting from the root of a directory tree. The organization of files within the directories is free: there may be sub-directories nested to any level. All sub-directories are browsed and all terminal files are checked for loading.
Gzip-compressed files are supported.
Zip archive entries are supported recursively.
This is a simple application of the visitor design pattern for directory hierarchy crawling.
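In typical use the crawler is not driven directly but registered with the data providers manager, which calls its feed method when a loader needs data. Below is a minimal sketch, assuming the fr.cnes.sirius.patrius.data package layout and a singleton getInstance()/addProvider API on DataProvidersManager (both are assumptions, not confirmed by this page):

```java
import java.io.File;

// Package and class locations are assumed from the PATRIUS data-handling layout.
import fr.cnes.sirius.patrius.data.DataProvidersManager;
import fr.cnes.sirius.patrius.data.DirectoryCrawler;
import fr.cnes.sirius.patrius.utils.exception.PatriusException;

public class CrawlerSetup {
    public static void main(final String[] args) throws PatriusException {
        // Hypothetical root of the directory tree holding the data files.
        final File root = new File("/path/to/data-root");

        // Register the crawler; the manager will later call its feed(...)
        // method whenever a data loader requests files matching a pattern.
        DataProvidersManager.getInstance().addProvider(new DirectoryCrawler(root));
    }
}
```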
See Also: DataProvidersManager, Serialized Form

Fields inherited from interface DataProvider: GZIP_FILE_PATTERN, ZIP_ARCHIVE_PATTERN
| Constructor and Description |
|---|
| DirectoryCrawler(File rootIn)<br>Build a data files crawler. |
| Modifier and Type | Method and Description |
|---|---|
| boolean | feed(Pattern supported, DataLoader visitor)<br>Feed a data file loader by browsing the data collection. |
public DirectoryCrawler(File rootIn) throws PatriusException

Build a data files crawler.

Parameters:
rootIn - root of the directories tree (must be a directory)

Throws:
PatriusException - if root is not a directory

public boolean feed(Pattern supported, DataLoader visitor) throws PatriusException
The method crawls all files referenced in the instance (for example, all files in a directory tree) and, for each file supported by the file loader, asks the loader to load it.
If the method completes without exception, then the data loader is considered to have been fed successfully and the top level data providers manager will return immediately without attempting to use the next configured providers.

If the method completes abruptly with an exception, then the top level data providers manager will try to use the next configured providers, in case another one can feed the data loader.
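For instance, registering two crawlers in order means the second one is only consulted if feeding from the first fails with an exception. A hedged sketch reusing the assumed manager API from above, with hypothetical directories:

```java
final DataProvidersManager manager = DataProvidersManager.getInstance(); // singleton accessor assumed
// Providers are tried in registration order: the backup tree is only crawled
// if feeding from the main tree completes abruptly with an exception.
manager.addProvider(new DirectoryCrawler(new File("/main/data-root")));
manager.addProvider(new DirectoryCrawler(new File("/backup/data-root")));
```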
Specified by:
feed in interface DataProvider

Parameters:
supported - pattern for file names supported by the visitor
visitor - data file visitor to use

Throws:
PatriusException - if the data loader cannot be fed (read error ...)
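The method can also be invoked directly with a custom visitor. Below is a minimal sketch, assuming DataLoader is a two-method visitor interface (stillAcceptsData() and loadData(InputStream, String)) and the package locations used above; the loader class and paths are hypothetical:

```java
import java.io.File;
import java.io.InputStream;
import java.util.regex.Pattern;

import fr.cnes.sirius.patrius.data.DataLoader;       // package names assumed
import fr.cnes.sirius.patrius.data.DirectoryCrawler;
import fr.cnes.sirius.patrius.utils.exception.PatriusException;

public class FeedExample {

    /** Minimal visitor that simply counts the files it is offered. */
    private static class CountingLoader implements DataLoader {
        private int count;

        @Override
        public boolean stillAcceptsData() {
            // Keep browsing: accept every remaining matching file.
            return true;
        }

        @Override
        public void loadData(final InputStream input, final String name) {
            // A real loader would parse the stream here.
            count++;
        }
    }

    public static void main(final String[] args) throws PatriusException {
        // Hypothetical data root; must be an existing directory.
        final DirectoryCrawler crawler = new DirectoryCrawler(new File("/path/to/data-root"));
        final CountingLoader loader = new CountingLoader();

        // Offer every file whose name ends in ".txt" to the loader.
        final boolean fed = crawler.feed(Pattern.compile(".*\\.txt$"), loader);
        System.out.println("loader fed: " + fed + ", files seen: " + loader.count);
    }
}
```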