AUTO sharding policy will apply DATA sharding policy as it failed to apply FILE sharding policy
2022-06-26 15:37:00 [there2belief]
When using tf.distribute.MirroredStrategy(), a warning appears:
AUTO sharding policy will apply DATA sharding policy as it failed to apply FILE sharding policy because of the following reason: Did not find a shardable source, walked to a node which is not a dataset
The code is as follows:

```python
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    keras_model = build_model()

train_datagen = ImageDataGenerator()
training_img_generator = train_datagen.flow_from_directory(
    input_path,
    target_size=(image_size, image_size),
    batch_size=batch_size,
    class_mode="categorical",
)
train_dataset = tf.data.Dataset.from_generator(
    lambda: training_img_generator,
    output_types=(tf.float32, tf.float32),
    output_shapes=([None, image_size, image_size, 3], [None, len(image_classes)])
)
# similar for validation_dataset = ...

keras_model.fit(
    train_dataset,
    steps_per_epoch=train_steps_per_epoch,
    epochs=epoch_count,
    validation_data=validation_dataset,
    validation_steps=validation_steps_per_epoch,
)
```

Now this seems to work, and the model trains as usual. However, during training I get the following warning message when using a mirrored strategy:
AUTO sharding policy will apply DATA sharding policy as it failed to apply FILE sharding policy because of the following reason: Did not find a shardable source, walked to a node which is not a dataset
So I added the following lines between creating the datasets and calling fit():

```python
options = tf.data.Options()
options.experimental_distribute.auto_shard_policy = tf.data.experimental.AutoShardPolicy.DATA
train_dataset.with_options(options)
validation_dataset.with_options(options)
```

However, I still get the same warning.
This leads me to these two questions:
- What do I need to do in order to get rid of this warning?
- Even more important: why is TF not able to split the dataset with the default AutoShardPolicy.FILE policy, since I am using thousands of images per class in the input folder?
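On the second question, a hedged sketch of what the auto-sharder appears to need: FILE sharding requires the pipeline to begin with a file-based source op (a list of file names) that tf.data can walk back to and split across replicas. A `from_generator()` dataset wraps an opaque Python callable, so the sharder finds no shardable source, regardless of how many image files sit in the input folder, and AUTO falls back to DATA sharding. The throwaway text files below stand in for the image directory; they are illustrative, not from the original post:

```python
import os
import tempfile
import tensorflow as tf

# Create a few throwaway files so the pipeline starts from real file names.
tmpdir = tempfile.mkdtemp()
for i in range(4):
    with open(os.path.join(tmpdir, f"{i}.txt"), "w") as f:
        f.write(f"record {i}\n")

# list_files() gives tf.data a file-based source node it can shard.
files = tf.data.Dataset.list_files(os.path.join(tmpdir, "*.txt"), shuffle=False)
dataset = files.interleave(tf.data.TextLineDataset)  # file source is preserved

options = tf.data.Options()
options.experimental_distribute.auto_shard_policy = (
    tf.data.experimental.AutoShardPolicy.FILE
)
dataset = dataset.with_options(options)  # FILE sharding can now split by file
```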