[MS_DEV_RUNTIME_CONF]Runtime config: compile_statistics:True
[PROF]distributed_cluster_init costs 0.005 msec.
[WARNING] DISTRIBUTED(15286,ffff9e0d9c10,python):2024-12-27-13:12:29.196.772 [mindspore/ccsrc/distributed/collective/collective_manager.cc:175] Initialize] This is simulation mode with level 1. Process's RANK_ID: 0, RANK_SIZE: 32
[WARNING] DISTRIBUTED(15286,ffff9e0d9c10,python):2024-12-27-13:12:29.196.820 [mindspore/ccsrc/distributed/collective/collective_manager.cc:254] InitializeDummyCommLib] Initializing dummy collective communication with rank size: 32, rank id: 0, local rank id: 0. Real rank size: 1.
[MS_RUNTIME_PROF]The jit_level is: O1, and enable kernelbykernel executor in the GRAPH mode.
[PROF]distributed_collective_init costs 0.162 msec.
2024-12-27 13:12:29,805 - mindformers[mindformers/version_control.py:102] - INFO - The Lazy Inline compilation acceleration feature only works in pipeline parallel mode (pipeline_stage > 1). Current pipeline stage=1, the feature is disabled by default. You can also enable lazy inline without pipeline parallel, by setting environment variable `export ENABLE_LAZY_INLINE_NO_PIPELINE=1`.
[WARNING] ME(15286:281473333435408,MainProcess):2024-12-27-13:12:29.807.770 [mindspore/testcases/testcases/tests/st/networks/mindformers/mindformers/modules/transformer/op_parallel_config.py:268] The optimizer shard True in auto_parallel_context is not equal to the optimizer_shard None in the OpParallelConfig. Please check the optimizer_shard to make them consistent.
[WARNING] ME(15286:281473333435408,MainProcess):2024-12-27-13:12:29.810.636 [mindspore/testcases/testcases/tests/st/networks/mindformers/mindformers/modules/transformer/op_parallel_config.py:268] The optimizer shard True in auto_parallel_context is not equal to the optimizer_shard None in the OpParallelConfig. Please check the optimizer_shard to make them consistent.
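The banner above shows a simulation run (RANK_SIZE 32, real rank size 1) and a hint that lazy inline can be turned on without pipeline parallelism via an environment variable. A minimal sketch of setting that variable before the framework is imported; only ENABLE_LAZY_INLINE_NO_PIPELINE is taken verbatim from the log message, and treating RANK_ID / RANK_SIZE as environment variables matching the banner values is an assumption about this setup.

```python
import os

# Set before importing mindspore/mindformers so the runtime picks it up.
# ENABLE_LAZY_INLINE_NO_PIPELINE is named verbatim in the log message above.
os.environ["ENABLE_LAZY_INLINE_NO_PIPELINE"] = "1"

# RANK_ID / RANK_SIZE mirror the values echoed in the distributed banner
# (assumed here to be read from the environment in this simulation setup).
os.environ.setdefault("RANK_ID", "0")
os.environ.setdefault("RANK_SIZE", "32")
```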
2024-12-27 13:12:29,811 - mindformers[mindformers/models/llama/llama.py:92] - INFO - enable asd op:False
2024-12-27 13:12:29,811 - mindformers[mindformers/models/llama/llama.py:96] - INFO - MoE config is None, use normal FFN
2024-12-27 13:12:29,958 - mindformers[mindformers/models/utils.py:120] - INFO - num_layers per stage: [[81]]
2024-12-27 13:12:29,958 - mindformers[mindformers/models/utils.py:121] - INFO - Accumulated num_layers per stage: [[81]]
2024-12-27 13:12:29,958 - mindformers[mindformers/models/utils.py:122] - INFO - Pipeline id list: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
2024-12-27 13:12:29,959 - mindformers[mindformers/models/utils.py:123] - INFO - Interleave id list: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
2024-12-27 13:12:29,959 - mindformers[mindformers/models/utils.py:133] - INFO - Formative layer_recompute: [[0]]
2024-12-27 13:12:29,959 - mindformers[mindformers/models/utils.py:134] - INFO - Formative select_recompute: {'feed_forward\\.mul': [[0]], 'feed_forward\\.w1\\.activation\\.silu': [[0]]}
2024-12-27 13:12:29,959 - mindformers[mindformers/models/utils.py:135] - INFO - Formative select_comm_recompute: {'.*\\.norm': [[0]]}
[WARNING] ME(15286:281473333435408,MainProcess):2024-12-27-13:12:29.983.547 [mindspore/common/parameter.py:850] This interface may be deleted in the future.
2024-12-27 13:12:32,026 - mindformers[mindformers/models/modeling_utils.py:1518] - INFO - model built, but weights is unloaded, since the config has no checkpoint_name_or_path attribute or checkpoint_name_or_path is None.
2024-12-27 13:12:32,026 - mindformers[mindformers/models/llama/llama.py:351] - INFO - Predict run mode:False
2024-12-27 13:12:32,035 - mindformers[mindformers/trainer/trainer.py:218] - INFO - The model instance has been entered, and the model will not be created from model_config
2024-12-27 13:12:32,035 - mindformers[mindformers/trainer/trainer.py:240] - WARNING - Recognizing that a model instance is sent and model_name is None,
2024-12-27 13:12:32,035 - mindformers[mindformers/trainer/trainer.py:242] - WARNING - it is recommended to select a model configuration that corresponds to the support of MindFormers based on the instance model and set model_name.
2024-12-27 13:12:32,035 - mindformers[mindformers/trainer/trainer.py:245] - WARNING - Otherwise, they will default to a general configuration.You are advised to pass instances such as optimizers, metric, tokenizer, and processor
2024-12-27 13:12:32,067 - mindformers[mindformers/trainer/trainer.py:952] - INFO - Load configs in /home/jenkins/mindspore/testcases/testcases/tests/st/networks/mindformers/configs/gpt2/run_gpt2.yaml to build trainer.
2024-12-27 13:12:32,067 - mindformers[mindformers/trainer/trainer.py:1035] - INFO - ..........Init Config..........
2024-12-27 13:12:32,067 - mindformers[mindformers/trainer/trainer.py:1058] - WARNING - When using the TrainingArguments class, its arguments will override the default config configuration.
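The `Formative select_recompute` / `select_comm_recompute` entries above pair regular-expression patterns with per-stage counts. Purely as an illustration (the exact matching rule inside mindformers is not shown in this log), the sketch below checks which of the logged patterns would hit a given cell scope name; the sample scope names are adapted from parameter names seen later in the log and are otherwise hypothetical.

```python
import re

# Patterns copied from the "Formative select_recompute" /
# "Formative select_comm_recompute" log lines above.
select_recompute = [r"feed_forward\.mul", r"feed_forward\.w1\.activation\.silu"]
select_comm_recompute = [r".*\.norm"]

# Hypothetical cell scope names, only for illustrating the regex matching.
scopes = [
    "model.layers.0.feed_forward.mul",
    "model.layers.0.feed_forward.w1.activation.silu",
    "model.norm_out",
]

for scope in scopes:
    hits = [p for p in select_recompute + select_comm_recompute if re.search(p, scope)]
    print(scope, "->", hits)
```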
2024-12-27 13:12:32,068 - mindformers[mindformers/core/parallel_config.py:39] - INFO - initial moe_config from dict: {'expert_num': 1, 'capacity_factor': 1.05, 'aux_loss_factor': 0.05, 'num_experts_chosen': 1}
2024-12-27 13:12:32,069 - mindformers[mindformers/core/parallel_config.py:45] - INFO - initial recompute_config from dict: {'recompute': False, 'select_recompute': False, 'parallel_optimizer_comm_recompute': False, 'mp_comm_recompute': True, 'recompute_slice_activation': False}
2024-12-27 13:12:32,069 - mindformers[mindformers/core/parallel_config.py:51] - INFO - initial parallel_config from dict: {'data_parallel': 1, 'model_parallel': 1, 'pipeline_stage': 1, 'use_seq_parallel': False, 'micro_batch_num': 1, 'vocab_emb_dp': True, 'gradient_aggregation_group': 4, 'expert_parallel': 1}
2024-12-27 13:12:32,070 - mindformers[mindformers/tools/utils.py:160] - INFO - set output path to '/home/jenkins/mindspore/testcases/testcases/tests/st/networks/large_models/llama/output'
2024-12-27 13:12:32,070 - mindformers[mindformers/tools/utils.py:175] - INFO - set strategy path to './output/strategy/ckpt_strategy_rank_0.ckpt'
2024-12-27 13:12:32,070 - mindformers[mindformers/trainer/base_trainer.py:89] - INFO - Now Running Task is: text_generation, Model is: gpt2
2024-12-27 13:12:32,070 - mindformers[mindformers/trainer/trainer.py:1100] - INFO - ..........Init Model..........
2024-12-27 13:12:32,071 - mindformers[mindformers/trainer/trainer.py:311] - INFO - ==========Trainer Init Success!==========
2024-12-27 13:12:32,071 - mindformers[mindformers/trainer/trainer.py:833] - INFO - The incoming model will be reinit when parallel config is reconfigured.
2024-12-27 13:12:32,071 - mindformers[mindformers/trainer/trainer.py:1100] - INFO - ..........Init Model..........
2024-12-27 13:12:32,071 - mindformers[mindformers/trainer/trainer.py:933] - INFO - ..........Reinit Model..........
[WARNING] ME(15286:281473333435408,MainProcess):2024-12-27-13:12:32.954.47 [mindspore/testcases/testcases/tests/st/networks/mindformers/mindformers/modules/transformer/op_parallel_config.py:268] The optimizer shard True in auto_parallel_context is not equal to the optimizer_shard False in the OpParallelConfig. Please check the optimizer_shard to make them consistent.
[WARNING] ME(15286:281473333435408,MainProcess):2024-12-27-13:12:32.984.02 [mindspore/testcases/testcases/tests/st/networks/mindformers/mindformers/modules/transformer/op_parallel_config.py:268] The optimizer shard True in auto_parallel_context is not equal to the optimizer_shard False in the OpParallelConfig. Please check the optimizer_shard to make them consistent.
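The `initial parallel_config from dict` record above shows the defaults picked up before the trainer reinitializes the model (data/model/pipeline parallel all 1), which on their own do not span the 32 simulated ranks; the reinit that follows switches to the 8-stage pipeline layout seen further down. A small sketch, using only values printed in the log, of the consistency check one might do by hand:

```python
# Values copied from the "initial parallel_config from dict" log line above.
parallel_config = {
    "data_parallel": 1,
    "model_parallel": 1,
    "pipeline_stage": 1,
    "use_seq_parallel": False,
    "micro_batch_num": 1,
    "vocab_emb_dp": True,
    "gradient_aggregation_group": 4,
    "expert_parallel": 1,
}

device_num = 32  # RANK_SIZE from the simulation banner.

model_world = (parallel_config["data_parallel"]
               * parallel_config["model_parallel"]
               * parallel_config["pipeline_stage"])
print(f"dp*mp*pp = {model_world}, device_num = {device_num}")
# -> dp*mp*pp = 1 vs. 32 devices; the trainer then re-initializes the model
#    (see the later "Pipeline parallel was opened: pipeline_stages = 8" record).
```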
2024-12-27 13:12:32,099 - mindformers[mindformers/models/llama/llama.py:92] - INFO - enable asd op:False
2024-12-27 13:12:32,099 - mindformers[mindformers/models/llama/llama.py:96] - INFO - MoE config is None, use normal FFN
2024-12-27 13:12:32,237 - mindformers[mindformers/models/utils.py:120] - INFO - num_layers per stage: [[10, 10, 10, 10, 10, 10, 10, 10]]
2024-12-27 13:12:32,238 - mindformers[mindformers/models/utils.py:121] - INFO - Accumulated num_layers per stage: [[10, 20, 30, 40, 50, 60, 70, 80]]
2024-12-27 13:12:32,238 - mindformers[mindformers/models/utils.py:122] - INFO - Pipeline id list: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7]
2024-12-27 13:12:32,238 - mindformers[mindformers/models/utils.py:123] - INFO - Interleave id list: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
2024-12-27 13:12:32,238 - mindformers[mindformers/models/utils.py:133] - INFO - Formative layer_recompute: [[0, 0, 0, 0, 0, 0, 0, 0]]
2024-12-27 13:12:32,239 - mindformers[mindformers/models/utils.py:134] - INFO - Formative select_recompute: {'feed_forward\\.mul': [[0, 0, 0, 0, 0, 0, 0, 0]], 'feed_forward\\.w1\\.activation\\.silu': [[0, 0, 0, 0, 0, 0, 0, 0]]}
2024-12-27 13:12:32,239 - mindformers[mindformers/models/utils.py:135] - INFO - Formative select_comm_recompute: {'.*\\.norm': [[0, 0, 0, 0, 0, 0, 0, 0]]}
2024-12-27 13:12:34,403 - mindformers[mindformers/models/modeling_utils.py:1518] - INFO - model built, but weights is unloaded, since the config has no checkpoint_name_or_path attribute or checkpoint_name_or_path is None.
2024-12-27 13:12:34,404 - mindformers[mindformers/models/llama/llama.py:351] - INFO - Predict run mode:False
2024-12-27 13:12:34,422 - mindformers[mindformers/trainer/base_trainer.py:176] - INFO - Pipeline parallel was opened: pipeline_stages = 8, full batch is True, gradient_accumulation_steps will not take effect in pipeline parallel, global batch size will be changed: global_batch_size = batch_size * data_parallel * micro_batch_num * micro_batch_interleave_num = 1024 = 32 * 1 * 32 * 1).
2024-12-27 13:12:34,423 - mindformers[mindformers/trainer/base_trainer.py:285] - WARNING - When using the pipeline parallel mode, the MFPipelineWithLossScaleCell class is used by default.
2024-12-27 13:12:34,423 - mindformers[mindformers/trainer/base_trainer.py:292] - INFO - PipelineWrapper under evaluate or predict mode will not take effect.
2024-12-27 13:12:34,423 - mindformers[mindformers/trainer/base_trainer.py:640] - INFO - .........Build Dataset For Train..........
2024-12-27 13:12:34,423 - mindformers[mindformers/trainer/base_trainer.py:369] - INFO - .........Build Dataset From Config..........
2024-12-27 13:12:34,423 - mindformers[mindformers/trainer/base_trainer.py:642] - INFO - Create train dataset finish, dataset size:5
2024-12-27 13:12:34,424 - mindformers[mindformers/trainer/utils.py:168] - WARNING - Sink mode is False, per epoch size is invalid, it will reset -1.
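Two of the derived quantities above can be re-checked directly from the logged inputs: the global batch size formula printed by base_trainer, and the layer-to-stage assignment behind the `Pipeline id list`. A sketch of that arithmetic, using only numbers that appear in the log; the id-list reconstruction is illustrative, not the mindformers implementation.

```python
from itertools import accumulate

# Global batch size, as printed by base_trainer.py:176 above.
batch_size, data_parallel, micro_batch_num, micro_batch_interleave_num = 32, 1, 32, 1
global_batch_size = batch_size * data_parallel * micro_batch_num * micro_batch_interleave_num
assert global_batch_size == 1024

# Layer-to-stage assignment for the logged 8-stage layout.
num_layers_per_stage = [10] * 8                        # "num_layers per stage: [[10, ..., 10]]"
accumulated = list(accumulate(num_layers_per_stage))   # matches "Accumulated num_layers per stage"
pipeline_id_list = [stage for stage, n in enumerate(num_layers_per_stage) for _ in range(n)]
assert accumulated == [10, 20, 30, 40, 50, 60, 70, 80]
assert pipeline_id_list[:12] == [0] * 10 + [1, 1]      # prefix of the logged "Pipeline id list"
print(global_batch_size, accumulated)
```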
2024-12-27 13:12:34,424 - mindformers[mindformers/trainer/utils.py:173] - INFO - Will be Training epochs:1, sink_size:-1 2024-12-27 13:12:34,424 - mindformers[mindformers/trainer/utils.py:174] - INFO - Create training dataset finish, dataset size:5 2024-12-27 13:12:34,424 - mindformers[mindformers/trainer/base_trainer.py:673] - INFO - .........Build Net For Train.......... 2024-12-27 13:12:34,500 - mindformers[mindformers/trainer/base_trainer.py:559] - INFO - Network Parameters: 65285 M. 2024-12-27 13:12:34,500 - mindformers[mindformers/trainer/base_trainer.py:695] - INFO - .........Build Optimizer For Train.......... 2024-12-27 13:12:34,500 - mindformers[mindformers/trainer/base_trainer.py:442] - INFO - .........Build Optimizer From Config.......... 2024-12-27 13:12:34,500 - mindformers[mindformers/trainer/base_trainer.py:475] - INFO - .........Build LR Schedule From Config.......... 2024-12-27 13:12:34,514 - mindformers[mindformers/trainer/optimizer_grouped_parameters.py:77] - WARNING - dynamic_lr_schedule will be reset and invalid when layer_scale is False. 2024-12-27 13:12:34,532 - mindformers[mindformers/trainer/optimizer_grouped_parameters.py:116] - INFO - Param groups = { "decay": { "weight_decay": 0.0, "params": [ "model.tok_embeddings.embedding_weight", "model.layers.0.attention.wq.weight", "model.layers.0.attention.wk.weight", "model.layers.0.attention.wv.weight", "model.layers.0.attention.wo.weight", "model.layers.0.feed_forward.w1.weight", "model.layers.0.feed_forward.w2.weight", "model.layers.0.feed_forward.w3.weight", "model.layers.1.attention.wq.weight", "model.layers.1.attention.wk.weight", "model.layers.1.attention.wv.weight", "model.layers.1.attention.wo.weight", "model.layers.1.feed_forward.w1.weight", "model.layers.1.feed_forward.w2.weight", "model.layers.1.feed_forward.w3.weight", "model.layers.2.attention.wq.weight", "model.layers.2.attention.wk.weight", "model.layers.2.attention.wv.weight", "model.layers.2.attention.wo.weight", "model.layers.2.feed_forward.w1.weight", "model.layers.2.feed_forward.w2.weight", "model.layers.2.feed_forward.w3.weight", "model.layers.3.attention.wq.weight", "model.layers.3.attention.wk.weight", "model.layers.3.attention.wv.weight", "model.layers.3.attention.wo.weight", "model.layers.3.feed_forward.w1.weight", "model.layers.3.feed_forward.w2.weight", "model.layers.3.feed_forward.w3.weight", "model.layers.4.attention.wq.weight", "model.layers.4.attention.wk.weight", "model.layers.4.attention.wv.weight", "model.layers.4.attention.wo.weight", "model.layers.4.feed_forward.w1.weight", "model.layers.4.feed_forward.w2.weight", "model.layers.4.feed_forward.w3.weight", "model.layers.5.attention.wq.weight", "model.layers.5.attention.wk.weight", "model.layers.5.attention.wv.weight", "model.layers.5.attention.wo.weight", "model.layers.5.feed_forward.w1.weight", "model.layers.5.feed_forward.w2.weight", "model.layers.5.feed_forward.w3.weight", "model.layers.6.attention.wq.weight", "model.layers.6.attention.wk.weight", "model.layers.6.attention.wv.weight", "model.layers.6.attention.wo.weight", "model.layers.6.feed_forward.w1.weight", "model.layers.6.feed_forward.w2.weight", "model.layers.6.feed_forward.w3.weight", "model.layers.7.attention.wq.weight", "model.layers.7.attention.wk.weight", "model.layers.7.attention.wv.weight", "model.layers.7.attention.wo.weight", "model.layers.7.feed_forward.w1.weight", "model.layers.7.feed_forward.w2.weight", "model.layers.7.feed_forward.w3.weight", "model.layers.8.attention.wq.weight", 
"model.layers.8.attention.wk.weight", "model.layers.8.attention.wv.weight", "model.layers.8.attention.wo.weight", "model.layers.8.feed_forward.w1.weight", "model.layers.8.feed_forward.w2.weight", "model.layers.8.feed_forward.w3.weight", "model.layers.9.attention.wq.weight", "model.layers.9.attention.wk.weight", "model.layers.9.attention.wv.weight", "model.layers.9.attention.wo.weight", "model.layers.9.feed_forward.w1.weight", "model.layers.9.feed_forward.w2.weight", "model.layers.9.feed_forward.w3.weight", "model.layers.10.attention.wq.weight", "model.layers.10.attention.wk.weight", "model.layers.10.attention.wv.weight", "model.layers.10.attention.wo.weight", "model.layers.10.feed_forward.w1.weight", "model.layers.10.feed_forward.w2.weight", "model.layers.10.feed_forward.w3.weight", "model.layers.11.attention.wq.weight", "model.layers.11.attention.wk.weight", "model.layers.11.attention.wv.weight", "model.layers.11.attention.wo.weight", "model.layers.11.feed_forward.w1.weight", "model.layers.11.feed_forward.w2.weight", "model.layers.11.feed_forward.w3.weight", "model.layers.12.attention.wq.weight", "model.layers.12.attention.wk.weight", "model.layers.12.attention.wv.weight", "model.layers.12.attention.wo.weight", "model.layers.12.feed_forward.w1.weight", "model.layers.12.feed_forward.w2.weight", "model.layers.12.feed_forward.w3.weight", "model.layers.13.attention.wq.weight", "model.layers.13.attention.wk.weight", "model.layers.13.attention.wv.weight", "model.layers.13.attention.wo.weight", "model.layers.13.feed_forward.w1.weight", "model.layers.13.feed_forward.w2.weight", "model.layers.13.feed_forward.w3.weight", "model.layers.14.attention.wq.weight", "model.layers.14.attention.wk.weight", "model.layers.14.attention.wv.weight", "model.layers.14.attention.wo.weight", "model.layers.14.feed_forward.w1.weight", "model.layers.14.feed_forward.w2.weight", "model.layers.14.feed_forward.w3.weight", "model.layers.15.attention.wq.weight", "model.layers.15.attention.wk.weight", "model.layers.15.attention.wv.weight", "model.layers.15.attention.wo.weight", "model.layers.15.feed_forward.w1.weight", "model.layers.15.feed_forward.w2.weight", "model.layers.15.feed_forward.w3.weight", "model.layers.16.attention.wq.weight", "model.layers.16.attention.wk.weight", "model.layers.16.attention.wv.weight", "model.layers.16.attention.wo.weight", "model.layers.16.feed_forward.w1.weight", "model.layers.16.feed_forward.w2.weight", "model.layers.16.feed_forward.w3.weight", "model.layers.17.attention.wq.weight", "model.layers.17.attention.wk.weight", "model.layers.17.attention.wv.weight", "model.layers.17.attention.wo.weight", "model.layers.17.feed_forward.w1.weight", "model.layers.17.feed_forward.w2.weight", "model.layers.17.feed_forward.w3.weight", "model.layers.18.attention.wq.weight", "model.layers.18.attention.wk.weight", "model.layers.18.attention.wv.weight", "model.layers.18.attention.wo.weight", "model.layers.18.feed_forward.w1.weight", "model.layers.18.feed_forward.w2.weight", "model.layers.18.feed_forward.w3.weight", "model.layers.19.attention.wq.weight", "model.layers.19.attention.wk.weight", "model.layers.19.attention.wv.weight", "model.layers.19.attention.wo.weight", "model.layers.19.feed_forward.w1.weight", "model.layers.19.feed_forward.w2.weight", "model.layers.19.feed_forward.w3.weight", "model.layers.20.attention.wq.weight", "model.layers.20.attention.wk.weight", "model.layers.20.attention.wv.weight", "model.layers.20.attention.wo.weight", "model.layers.20.feed_forward.w1.weight", 
"model.layers.20.feed_forward.w2.weight", "model.layers.20.feed_forward.w3.weight", "model.layers.21.attention.wq.weight", "model.layers.21.attention.wk.weight", "model.layers.21.attention.wv.weight", "model.layers.21.attention.wo.weight", "model.layers.21.feed_forward.w1.weight", "model.layers.21.feed_forward.w2.weight", "model.layers.21.feed_forward.w3.weight", "model.layers.22.attention.wq.weight", "model.layers.22.attention.wk.weight", "model.layers.22.attention.wv.weight", "model.layers.22.attention.wo.weight", "model.layers.22.feed_forward.w1.weight", "model.layers.22.feed_forward.w2.weight", "model.layers.22.feed_forward.w3.weight", "model.layers.23.attention.wq.weight", "model.layers.23.attention.wk.weight", "model.layers.23.attention.wv.weight", "model.layers.23.attention.wo.weight", "model.layers.23.feed_forward.w1.weight", "model.layers.23.feed_forward.w2.weight", "model.layers.23.feed_forward.w3.weight", "model.layers.24.attention.wq.weight", "model.layers.24.attention.wk.weight", "model.layers.24.attention.wv.weight", "model.layers.24.attention.wo.weight", "model.layers.24.feed_forward.w1.weight", "model.layers.24.feed_forward.w2.weight", "model.layers.24.feed_forward.w3.weight", "model.layers.25.attention.wq.weight", "model.layers.25.attention.wk.weight", "model.layers.25.attention.wv.weight", "model.layers.25.attention.wo.weight", "model.layers.25.feed_forward.w1.weight", "model.layers.25.feed_forward.w2.weight", "model.layers.25.feed_forward.w3.weight", "model.layers.26.attention.wq.weight", "model.layers.26.attention.wk.weight", "model.layers.26.attention.wv.weight", "model.layers.26.attention.wo.weight", "model.layers.26.feed_forward.w1.weight", "model.layers.26.feed_forward.w2.weight", "model.layers.26.feed_forward.w3.weight", "model.layers.27.attention.wq.weight", "model.layers.27.attention.wk.weight", "model.layers.27.attention.wv.weight", "model.layers.27.attention.wo.weight", "model.layers.27.feed_forward.w1.weight", "model.layers.27.feed_forward.w2.weight", "model.layers.27.feed_forward.w3.weight", "model.layers.28.attention.wq.weight", "model.layers.28.attention.wk.weight", "model.layers.28.attention.wv.weight", "model.layers.28.attention.wo.weight", "model.layers.28.feed_forward.w1.weight", "model.layers.28.feed_forward.w2.weight", "model.layers.28.feed_forward.w3.weight", "model.layers.29.attention.wq.weight", "model.layers.29.attention.wk.weight", "model.layers.29.attention.wv.weight", "model.layers.29.attention.wo.weight", "model.layers.29.feed_forward.w1.weight", "model.layers.29.feed_forward.w2.weight", "model.layers.29.feed_forward.w3.weight", "model.layers.30.attention.wq.weight", "model.layers.30.attention.wk.weight", "model.layers.30.attention.wv.weight", "model.layers.30.attention.wo.weight", "model.layers.30.feed_forward.w1.weight", "model.layers.30.feed_forward.w2.weight", "model.layers.30.feed_forward.w3.weight", "model.layers.31.attention.wq.weight", "model.layers.31.attention.wk.weight", "model.layers.31.attention.wv.weight", "model.layers.31.attention.wo.weight", "model.layers.31.feed_forward.w1.weight", "model.layers.31.feed_forward.w2.weight", "model.layers.31.feed_forward.w3.weight", "model.layers.32.attention.wq.weight", "model.layers.32.attention.wk.weight", "model.layers.32.attention.wv.weight", "model.layers.32.attention.wo.weight", "model.layers.32.feed_forward.w1.weight", "model.layers.32.feed_forward.w2.weight", "model.layers.32.feed_forward.w3.weight", "model.layers.33.attention.wq.weight", "model.layers.33.attention.wk.weight", 
"model.layers.33.attention.wv.weight", "model.layers.33.attention.wo.weight", "model.layers.33.feed_forward.w1.weight", "model.layers.33.feed_forward.w2.weight", "model.layers.33.feed_forward.w3.weight", "model.layers.34.attention.wq.weight", "model.layers.34.attention.wk.weight", "model.layers.34.attention.wv.weight", "model.layers.34.attention.wo.weight", "model.layers.34.feed_forward.w1.weight", "model.layers.34.feed_forward.w2.weight", "model.layers.34.feed_forward.w3.weight", "model.layers.35.attention.wq.weight", "model.layers.35.attention.wk.weight", "model.layers.35.attention.wv.weight", "model.layers.35.attention.wo.weight", "model.layers.35.feed_forward.w1.weight", "model.layers.35.feed_forward.w2.weight", "model.layers.35.feed_forward.w3.weight", "model.layers.36.attention.wq.weight", "model.layers.36.attention.wk.weight", "model.layers.36.attention.wv.weight", "model.layers.36.attention.wo.weight", "model.layers.36.feed_forward.w1.weight", "model.layers.36.feed_forward.w2.weight", "model.layers.36.feed_forward.w3.weight", "model.layers.37.attention.wq.weight", "model.layers.37.attention.wk.weight", "model.layers.37.attention.wv.weight", "model.layers.37.attention.wo.weight", "model.layers.37.feed_forward.w1.weight", "model.layers.37.feed_forward.w2.weight", "model.layers.37.feed_forward.w3.weight", "model.layers.38.attention.wq.weight", "model.layers.38.attention.wk.weight", "model.layers.38.attention.wv.weight", "model.layers.38.attention.wo.weight", "model.layers.38.feed_forward.w1.weight", "model.layers.38.feed_forward.w2.weight", "model.layers.38.feed_forward.w3.weight", "model.layers.39.attention.wq.weight", "model.layers.39.attention.wk.weight", "model.layers.39.attention.wv.weight", "model.layers.39.attention.wo.weight", "model.layers.39.feed_forward.w1.weight", "model.layers.39.feed_forward.w2.weight", "model.layers.39.feed_forward.w3.weight", "model.layers.40.attention.wq.weight", "model.layers.40.attention.wk.weight", "model.layers.40.attention.wv.weight", "model.layers.40.attention.wo.weight", "model.layers.40.feed_forward.w1.weight", "model.layers.40.feed_forward.w2.weight", "model.layers.40.feed_forward.w3.weight", "model.layers.41.attention.wq.weight", "model.layers.41.attention.wk.weight", "model.layers.41.attention.wv.weight", "model.layers.41.attention.wo.weight", "model.layers.41.feed_forward.w1.weight", "model.layers.41.feed_forward.w2.weight", "model.layers.41.feed_forward.w3.weight", "model.layers.42.attention.wq.weight", "model.layers.42.attention.wk.weight", "model.layers.42.attention.wv.weight", "model.layers.42.attention.wo.weight", "model.layers.42.feed_forward.w1.weight", "model.layers.42.feed_forward.w2.weight", "model.layers.42.feed_forward.w3.weight", "model.layers.43.attention.wq.weight", "model.layers.43.attention.wk.weight", "model.layers.43.attention.wv.weight", "model.layers.43.attention.wo.weight", "model.layers.43.feed_forward.w1.weight", "model.layers.43.feed_forward.w2.weight", "model.layers.43.feed_forward.w3.weight", "model.layers.44.attention.wq.weight", "model.layers.44.attention.wk.weight", "model.layers.44.attention.wv.weight", "model.layers.44.attention.wo.weight", "model.layers.44.feed_forward.w1.weight", "model.layers.44.feed_forward.w2.weight", "model.layers.44.feed_forward.w3.weight", "model.layers.45.attention.wq.weight", "model.layers.45.attention.wk.weight", "model.layers.45.attention.wv.weight", "model.layers.45.attention.wo.weight", "model.layers.45.feed_forward.w1.weight", "model.layers.45.feed_forward.w2.weight", 
"model.layers.45.feed_forward.w3.weight", "model.layers.46.attention.wq.weight", "model.layers.46.attention.wk.weight", "model.layers.46.attention.wv.weight", "model.layers.46.attention.wo.weight", "model.layers.46.feed_forward.w1.weight", "model.layers.46.feed_forward.w2.weight", "model.layers.46.feed_forward.w3.weight", "model.layers.47.attention.wq.weight", "model.layers.47.attention.wk.weight", "model.layers.47.attention.wv.weight", "model.layers.47.attention.wo.weight", "model.layers.47.feed_forward.w1.weight", "model.layers.47.feed_forward.w2.weight", "model.layers.47.feed_forward.w3.weight", "model.layers.48.attention.wq.weight", "model.layers.48.attention.wk.weight", "model.layers.48.attention.wv.weight", "model.layers.48.attention.wo.weight", "model.layers.48.feed_forward.w1.weight", "model.layers.48.feed_forward.w2.weight", "model.layers.48.feed_forward.w3.weight", "model.layers.49.attention.wq.weight", "model.layers.49.attention.wk.weight", "model.layers.49.attention.wv.weight", "model.layers.49.attention.wo.weight", "model.layers.49.feed_forward.w1.weight", "model.layers.49.feed_forward.w2.weight", "model.layers.49.feed_forward.w3.weight", "model.layers.50.attention.wq.weight", "model.layers.50.attention.wk.weight", "model.layers.50.attention.wv.weight", "model.layers.50.attention.wo.weight", "model.layers.50.feed_forward.w1.weight", "model.layers.50.feed_forward.w2.weight", "model.layers.50.feed_forward.w3.weight", "model.layers.51.attention.wq.weight", "model.layers.51.attention.wk.weight", "model.layers.51.attention.wv.weight", "model.layers.51.attention.wo.weight", "model.layers.51.feed_forward.w1.weight", "model.layers.51.feed_forward.w2.weight", "model.layers.51.feed_forward.w3.weight", "model.layers.52.attention.wq.weight", "model.layers.52.attention.wk.weight", "model.layers.52.attention.wv.weight", "model.layers.52.attention.wo.weight", "model.layers.52.feed_forward.w1.weight", "model.layers.52.feed_forward.w2.weight", "model.layers.52.feed_forward.w3.weight", "model.layers.53.attention.wq.weight", "model.layers.53.attention.wk.weight", "model.layers.53.attention.wv.weight", "model.layers.53.attention.wo.weight", "model.layers.53.feed_forward.w1.weight", "model.layers.53.feed_forward.w2.weight", "model.layers.53.feed_forward.w3.weight", "model.layers.54.attention.wq.weight", "model.layers.54.attention.wk.weight", "model.layers.54.attention.wv.weight", "model.layers.54.attention.wo.weight", "model.layers.54.feed_forward.w1.weight", "model.layers.54.feed_forward.w2.weight", "model.layers.54.feed_forward.w3.weight", "model.layers.55.attention.wq.weight", "model.layers.55.attention.wk.weight", "model.layers.55.attention.wv.weight", "model.layers.55.attention.wo.weight", "model.layers.55.feed_forward.w1.weight", "model.layers.55.feed_forward.w2.weight", "model.layers.55.feed_forward.w3.weight", "model.layers.56.attention.wq.weight", "model.layers.56.attention.wk.weight", "model.layers.56.attention.wv.weight", "model.layers.56.attention.wo.weight", "model.layers.56.feed_forward.w1.weight", "model.layers.56.feed_forward.w2.weight", "model.layers.56.feed_forward.w3.weight", "model.layers.57.attention.wq.weight", "model.layers.57.attention.wk.weight", "model.layers.57.attention.wv.weight", "model.layers.57.attention.wo.weight", "model.layers.57.feed_forward.w1.weight", "model.layers.57.feed_forward.w2.weight", "model.layers.57.feed_forward.w3.weight", "model.layers.58.attention.wq.weight", "model.layers.58.attention.wk.weight", "model.layers.58.attention.wv.weight", 
"model.layers.58.attention.wo.weight", "model.layers.58.feed_forward.w1.weight", "model.layers.58.feed_forward.w2.weight", "model.layers.58.feed_forward.w3.weight", "model.layers.59.attention.wq.weight", "model.layers.59.attention.wk.weight", "model.layers.59.attention.wv.weight", "model.layers.59.attention.wo.weight", "model.layers.59.feed_forward.w1.weight", "model.layers.59.feed_forward.w2.weight", "model.layers.59.feed_forward.w3.weight", "model.layers.60.attention.wq.weight", "model.layers.60.attention.wk.weight", "model.layers.60.attention.wv.weight", "model.layers.60.attention.wo.weight", "model.layers.60.feed_forward.w1.weight", "model.layers.60.feed_forward.w2.weight", "model.layers.60.feed_forward.w3.weight", "model.layers.61.attention.wq.weight", "model.layers.61.attention.wk.weight", "model.layers.61.attention.wv.weight", "model.layers.61.attention.wo.weight", "model.layers.61.feed_forward.w1.weight", "model.layers.61.feed_forward.w2.weight", "model.layers.61.feed_forward.w3.weight", "model.layers.62.attention.wq.weight", "model.layers.62.attention.wk.weight", "model.layers.62.attention.wv.weight", "model.layers.62.attention.wo.weight", "model.layers.62.feed_forward.w1.weight", "model.layers.62.feed_forward.w2.weight", "model.layers.62.feed_forward.w3.weight", "model.layers.63.attention.wq.weight", "model.layers.63.attention.wk.weight", "model.layers.63.attention.wv.weight", "model.layers.63.attention.wo.weight", "model.layers.63.feed_forward.w1.weight", "model.layers.63.feed_forward.w2.weight", "model.layers.63.feed_forward.w3.weight", "model.layers.64.attention.wq.weight", "model.layers.64.attention.wk.weight", "model.layers.64.attention.wv.weight", "model.layers.64.attention.wo.weight", "model.layers.64.feed_forward.w1.weight", "model.layers.64.feed_forward.w2.weight", "model.layers.64.feed_forward.w3.weight", "model.layers.65.attention.wq.weight", "model.layers.65.attention.wk.weight", "model.layers.65.attention.wv.weight", "model.layers.65.attention.wo.weight", "model.layers.65.feed_forward.w1.weight", "model.layers.65.feed_forward.w2.weight", "model.layers.65.feed_forward.w3.weight", "model.layers.66.attention.wq.weight", "model.layers.66.attention.wk.weight", "model.layers.66.attention.wv.weight", "model.layers.66.attention.wo.weight", "model.layers.66.feed_forward.w1.weight", "model.layers.66.feed_forward.w2.weight", "model.layers.66.feed_forward.w3.weight", "model.layers.67.attention.wq.weight", "model.layers.67.attention.wk.weight", "model.layers.67.attention.wv.weight", "model.layers.67.attention.wo.weight", "model.layers.67.feed_forward.w1.weight", "model.layers.67.feed_forward.w2.weight", "model.layers.67.feed_forward.w3.weight", "model.layers.68.attention.wq.weight", "model.layers.68.attention.wk.weight", "model.layers.68.attention.wv.weight", "model.layers.68.attention.wo.weight", "model.layers.68.feed_forward.w1.weight", "model.layers.68.feed_forward.w2.weight", "model.layers.68.feed_forward.w3.weight", "model.layers.69.attention.wq.weight", "model.layers.69.attention.wk.weight", "model.layers.69.attention.wv.weight", "model.layers.69.attention.wo.weight", "model.layers.69.feed_forward.w1.weight", "model.layers.69.feed_forward.w2.weight", "model.layers.69.feed_forward.w3.weight", "model.layers.70.attention.wq.weight", "model.layers.70.attention.wk.weight", "model.layers.70.attention.wv.weight", "model.layers.70.attention.wo.weight", "model.layers.70.feed_forward.w1.weight", "model.layers.70.feed_forward.w2.weight", "model.layers.70.feed_forward.w3.weight", 
"model.layers.71.attention.wq.weight", "model.layers.71.attention.wk.weight", "model.layers.71.attention.wv.weight", "model.layers.71.attention.wo.weight", "model.layers.71.feed_forward.w1.weight", "model.layers.71.feed_forward.w2.weight", "model.layers.71.feed_forward.w3.weight", "model.layers.72.attention.wq.weight", "model.layers.72.attention.wk.weight", "model.layers.72.attention.wv.weight", "model.layers.72.attention.wo.weight", "model.layers.72.feed_forward.w1.weight", "model.layers.72.feed_forward.w2.weight", "model.layers.72.feed_forward.w3.weight", "model.layers.73.attention.wq.weight", "model.layers.73.attention.wk.weight", "model.layers.73.attention.wv.weight", "model.layers.73.attention.wo.weight", "model.layers.73.feed_forward.w1.weight", "model.layers.73.feed_forward.w2.weight", "model.layers.73.feed_forward.w3.weight", "model.layers.74.attention.wq.weight", "model.layers.74.attention.wk.weight", "model.layers.74.attention.wv.weight", "model.layers.74.attention.wo.weight", "model.layers.74.feed_forward.w1.weight", "model.layers.74.feed_forward.w2.weight", "model.layers.74.feed_forward.w3.weight", "model.layers.75.attention.wq.weight", "model.layers.75.attention.wk.weight", "model.layers.75.attention.wv.weight", "model.layers.75.attention.wo.weight", "model.layers.75.feed_forward.w1.weight", "model.layers.75.feed_forward.w2.weight", "model.layers.75.feed_forward.w3.weight", "model.layers.76.attention.wq.weight", "model.layers.76.attention.wk.weight", "model.layers.76.attention.wv.weight", "model.layers.76.attention.wo.weight", "model.layers.76.feed_forward.w1.weight", "model.layers.76.feed_forward.w2.weight", "model.layers.76.feed_forward.w3.weight", "model.layers.77.attention.wq.weight", "model.layers.77.attention.wk.weight", "model.layers.77.attention.wv.weight", "model.layers.77.attention.wo.weight", "model.layers.77.feed_forward.w1.weight", "model.layers.77.feed_forward.w2.weight", "model.layers.77.feed_forward.w3.weight", "model.layers.78.attention.wq.weight", "model.layers.78.attention.wk.weight", "model.layers.78.attention.wv.weight", "model.layers.78.attention.wo.weight", "model.layers.78.feed_forward.w1.weight", "model.layers.78.feed_forward.w2.weight", "model.layers.78.feed_forward.w3.weight", "model.layers.79.attention.wq.weight", "model.layers.79.attention.wk.weight", "model.layers.79.attention.wv.weight", "model.layers.79.attention.wo.weight", "model.layers.79.feed_forward.w1.weight", "model.layers.79.feed_forward.w2.weight", "model.layers.79.feed_forward.w3.weight", "lm_head.weight" ] }, "no_decay": { "weight_decay": 0.0, "params": [ "model.layers.0.ffn_norm.weight", "model.layers.0.attention_norm.weight", "model.layers.1.ffn_norm.weight", "model.layers.1.attention_norm.weight", "model.layers.2.ffn_norm.weight", "model.layers.2.attention_norm.weight", "model.layers.3.ffn_norm.weight", "model.layers.3.attention_norm.weight", "model.layers.4.ffn_norm.weight", "model.layers.4.attention_norm.weight", "model.layers.5.ffn_norm.weight", "model.layers.5.attention_norm.weight", "model.layers.6.ffn_norm.weight", "model.layers.6.attention_norm.weight", "model.layers.7.ffn_norm.weight", "model.layers.7.attention_norm.weight", "model.layers.8.ffn_norm.weight", "model.layers.8.attention_norm.weight", "model.layers.9.ffn_norm.weight", "model.layers.9.attention_norm.weight", "model.layers.10.ffn_norm.weight", "model.layers.10.attention_norm.weight", "model.layers.11.ffn_norm.weight", "model.layers.11.attention_norm.weight", "model.layers.12.ffn_norm.weight", 
"model.layers.12.attention_norm.weight", "model.layers.13.ffn_norm.weight", "model.layers.13.attention_norm.weight", "model.layers.14.ffn_norm.weight", "model.layers.14.attention_norm.weight", "model.layers.15.ffn_norm.weight", "model.layers.15.attention_norm.weight", "model.layers.16.ffn_norm.weight", "model.layers.16.attention_norm.weight", "model.layers.17.ffn_norm.weight", "model.layers.17.attention_norm.weight", "model.layers.18.ffn_norm.weight", "model.layers.18.attention_norm.weight", "model.layers.19.ffn_norm.weight", "model.layers.19.attention_norm.weight", "model.layers.20.ffn_norm.weight", "model.layers.20.attention_norm.weight", "model.layers.21.ffn_norm.weight", "model.layers.21.attention_norm.weight", "model.layers.22.ffn_norm.weight", "model.layers.22.attention_norm.weight", "model.layers.23.ffn_norm.weight", "model.layers.23.attention_norm.weight", "model.layers.24.ffn_norm.weight", "model.layers.24.attention_norm.weight", "model.layers.25.ffn_norm.weight", "model.layers.25.attention_norm.weight", "model.layers.26.ffn_norm.weight", "model.layers.26.attention_norm.weight", "model.layers.27.ffn_norm.weight", "model.layers.27.attention_norm.weight", "model.layers.28.ffn_norm.weight", "model.layers.28.attention_norm.weight", "model.layers.29.ffn_norm.weight", "model.layers.29.attention_norm.weight", "model.layers.30.ffn_norm.weight", "model.layers.30.attention_norm.weight", "model.layers.31.ffn_norm.weight", "model.layers.31.attention_norm.weight", "model.layers.32.ffn_norm.weight", "model.layers.32.attention_norm.weight", "model.layers.33.ffn_norm.weight", "model.layers.33.attention_norm.weight", "model.layers.34.ffn_norm.weight", "model.layers.34.attention_norm.weight", "model.layers.35.ffn_norm.weight", "model.layers.35.attention_norm.weight", "model.layers.36.ffn_norm.weight", "model.layers.36.attention_norm.weight", "model.layers.37.ffn_norm.weight", "model.layers.37.attention_norm.weight", "model.layers.38.ffn_norm.weight", "model.layers.38.attention_norm.weight", "model.layers.39.ffn_norm.weight", "model.layers.39.attention_norm.weight", "model.layers.40.ffn_norm.weight", "model.layers.40.attention_norm.weight", "model.layers.41.ffn_norm.weight", "model.layers.41.attention_norm.weight", "model.layers.42.ffn_norm.weight", "model.layers.42.attention_norm.weight", "model.layers.43.ffn_norm.weight", "model.layers.43.attention_norm.weight", "model.layers.44.ffn_norm.weight", "model.layers.44.attention_norm.weight", "model.layers.45.ffn_norm.weight", "model.layers.45.attention_norm.weight", "model.layers.46.ffn_norm.weight", "model.layers.46.attention_norm.weight", "model.layers.47.ffn_norm.weight", "model.layers.47.attention_norm.weight", "model.layers.48.ffn_norm.weight", "model.layers.48.attention_norm.weight", "model.layers.49.ffn_norm.weight", "model.layers.49.attention_norm.weight", "model.layers.50.ffn_norm.weight", "model.layers.50.attention_norm.weight", "model.layers.51.ffn_norm.weight", "model.layers.51.attention_norm.weight", "model.layers.52.ffn_norm.weight", "model.layers.52.attention_norm.weight", "model.layers.53.ffn_norm.weight", "model.layers.53.attention_norm.weight", "model.layers.54.ffn_norm.weight", "model.layers.54.attention_norm.weight", "model.layers.55.ffn_norm.weight", "model.layers.55.attention_norm.weight", "model.layers.56.ffn_norm.weight", "model.layers.56.attention_norm.weight", "model.layers.57.ffn_norm.weight", "model.layers.57.attention_norm.weight", "model.layers.58.ffn_norm.weight", "model.layers.58.attention_norm.weight", 
"model.layers.59.ffn_norm.weight", "model.layers.59.attention_norm.weight", "model.layers.60.ffn_norm.weight", "model.layers.60.attention_norm.weight", "model.layers.61.ffn_norm.weight", "model.layers.61.attention_norm.weight", "model.layers.62.ffn_norm.weight", "model.layers.62.attention_norm.weight", "model.layers.63.ffn_norm.weight", "model.layers.63.attention_norm.weight", "model.layers.64.ffn_norm.weight", "model.layers.64.attention_norm.weight", "model.layers.65.ffn_norm.weight", "model.layers.65.attention_norm.weight", "model.layers.66.ffn_norm.weight", "model.layers.66.attention_norm.weight", "model.layers.67.ffn_norm.weight", "model.layers.67.attention_norm.weight", "model.layers.68.ffn_norm.weight", "model.layers.68.attention_norm.weight", "model.layers.69.ffn_norm.weight", "model.layers.69.attention_norm.weight", "model.layers.70.ffn_norm.weight", "model.layers.70.attention_norm.weight", "model.layers.71.ffn_norm.weight", "model.layers.71.attention_norm.weight", "model.layers.72.ffn_norm.weight", "model.layers.72.attention_norm.weight", "model.layers.73.ffn_norm.weight", "model.layers.73.attention_norm.weight", "model.layers.74.ffn_norm.weight", "model.layers.74.attention_norm.weight", "model.layers.75.ffn_norm.weight", "model.layers.75.attention_norm.weight", "model.layers.76.ffn_norm.weight", "model.layers.76.attention_norm.weight", "model.layers.77.ffn_norm.weight", "model.layers.77.attention_norm.weight", "model.layers.78.ffn_norm.weight", "model.layers.78.attention_norm.weight", "model.layers.79.ffn_norm.weight", "model.layers.79.attention_norm.weight", "model.norm_out.weight" ] } } 2024-12-27 13:12:35,283 - mindformers[mindformers/trainer/base_trainer.py:700] - INFO - .........Build Running Wrapper From Config For Train.......... 2024-12-27 13:12:35,283 - mindformers[mindformers/trainer/base_trainer.py:512] - INFO - .........Build Model Wrapper for Train From Config.......... 2024-12-27 13:12:35,642 - mindformers[mindformers/trainer/base_trainer.py:704] - INFO - .........Build Callbacks For Train.......... 2024-12-27 13:12:35,643 - mindformers[mindformers/trainer/base_trainer.py:738] - INFO - .........Starting Init Train Model.......... 2024-12-27 13:12:35,644 - mindformers[mindformers/trainer/base_trainer.py:784] - INFO - .........Starting Training Model.......... 
{'auto_trans_ckpt': False, 'auto_tune': False, 'autotune_per_step': 10, 'callbacks': [OrderedDict([('type', 'MFLossMonitor'), ('per_print_times', 1)])], 'context': {'device_id': 0, 'device_target': 'Ascend', 'enable_graph_kernel': False, 'graph_kernel_flags': '--opt_level=0', 'max_call_depth': 10000, 'max_device_memory': '1024GB', 'mode': 0, 'save_graphs': False, 'save_graphs_path': './graph'}, 'data_seed': None, 'data_size': 5, 'device_num': 32, 'do_eval': False, 'do_predict': False, 'do_train': False, 'eval_callbacks': [OrderedDict([('type', 'ObsMonitor')])], 'eval_dataset': {'auto_tune': False, 'autotune_per_step': 10, 'batch_size': 1024, 'data_loader': {'dataset_dir': '', 'shuffle': False, 'type': 'MindDataset'}, 'do_eval': True, 'drop_remainder': False, 'filepath_prefix': './autotune', 'input_columns': ['input_ids', 'attention_mask'], 'num_parallel_workers': 8, 'numa_enable': False, 'output_columns': ['input_ids', 'attention_mask'], 'prefetch_size': 1, 'profile': False, 'python_multiprocessing': False, 'repeat': 1, 'seed': 42}, 'eval_dataset_task': {'dataset_config': {'auto_tune': False, 'autotune_per_step': 10, 'batch_size': 1024, 'data_loader': {'dataset_dir': '', 'shuffle': False, 'type': 'MindDataset'}, 'do_eval': True, 'drop_remainder': False, 'filepath_prefix': './autotune', 'input_columns': ['input_ids', 'attention_mask'], 'num_parallel_workers': 8, 'numa_enable': False, 'output_columns': ['input_ids', 'attention_mask'], 'prefetch_size': 1, 'profile': False, 'python_multiprocessing': False, 'repeat': 1, 'seed': 42}, 'type': 'CausalLanguageModelDataset'}, 'eval_epoch_interval': None, 'eval_step_interval': None, 'filepath_prefix': './autotune', 'hub_always_push': False, 'hub_model_id': None, 'hub_private_repo': False, 'hub_strategy': 'every_save', 'hub_token': None, 'ignore_data_skip': False, 'init_start_profile': False, 'layer_decay': 0.65, 'layer_scale': False, 'load_checkpoint': None, 'lr_scale': False, 'lr_scale_factor': 256, 'lr_schedule': {'learning_rate': 5e-05, 'lr_end': 1e-06, 'total_steps': 5, 'type': 'cosine', 'warmup_lr_init': 0.0, 'warmup_steps': 0}, 'metric': [OrderedDict([('type', 'PerplexityMetric')])], 'micro_batch_interleave_num': 1, 'model': {'arch': {'type': 'GPT2LMHeadModel'}, 'model_config': {'attention_dropout_rate': 0.1, 'checkpoint_name_or_path': None, 'compute_dtype': 'float16', 'do_sample': True, 'eos_token_id': 50256, 'expand_ratio': 4, 'hidden_act': 'gelu', 'hidden_dropout_rate': 0.1, 'hidden_size': 768, 'layernorm_compute_type': 'float32', 'max_decode_length': 1024, 'num_heads': 12, 'num_layers': 12, 'param_init_type': 'float32', 'repetition_penalty': 1, 'seq_length': 1024, 'softmax_compute_type': 'float32', 'top_k': 5, 'top_p': 1, 'type': 'GPT2Config', 'use_flash_attention': False, 'use_past': False, 'use_prompt_flash_attention': False, 'vocab_size': 50257}}, 'moe_config': , 'only_save_strategy': False, 'optimizer': {'beta1': 0.9, 'beta2': 0.999, 'eps': 1e-08, 'type': 'fp32_adamw', 'weight_decay': 0.0}, 'output_dir': './output', 'overwrite_output_dir': False, 'parallel': {'dataset_strategy': 'full_batch', 'enable_alltoall': False, 'enable_parallel_optimizer': False, 'full_batch': True, 'gradients_mean': False, 'loss_repeated_mean': False, 'parallel_mode': 1, 'parallel_optimizer_config': {'gradient_accumulation_shard': False, 'optimizer_weight_shard_size': 1, 'parallel_optimizer_threshold': 64}, 'search_mode': 'sharding_propagation', 'strategy_ckpt_save_file': './output/strategy/ckpt_strategy_rank_0.ckpt'}, 'parallel_config': , 'processor': 
{'return_tensors': 'ms', 'tokenizer': {'bos_token': '<|endoftext|>', 'eos_token': '<|endoftext|>', 'pad_token': '<|endoftext|>', 'type': 'GPT2Tokenizer', 'unk_token': '<|endoftext|>'}, 'type': 'GPT2Processor'}, 'profile': False, 'profile_communication': False, 'profile_end_step': 10, 'profile_memory': True, 'profile_start_step': 1, 'profile_stop_step': 10, 'push_to_hub': False, 'rank_id': 0, 'recompute_config': , 'remote_save_url': 'Please input obs url on AICC platform.', 'resume_training': False, 'run_mode': 'train', 'runner_config': {'batch_size': 1024, 'epochs': 1, 'gradient_accumulation_steps': 1, 'initial_epoch': 0, 'initial_step': 0, 'origin_epochs': 1, 'sink_mode': False, 'sink_size': -1}, 'runner_wrapper': {'max_grad_norm': 1.0, 'micro_batch_num': 32, 'scale_sense': DynamicLossScaleUpdateCell<>, 'type': 'MFPipelineWithLossScaleCell', 'use_clip_grad': True}, 'seed': 42, 'src_strategy_path_or_dir': None, 'train_dataset': {'auto_tune': False, 'autotune_per_step': 10, 'batch_size': 1024, 'data_loader': {'dataset_dir': '', 'shuffle': True, 'type': 'MindDataset'}, 'do_eval': False, 'drop_remainder': True, 'filepath_prefix': './autotune', 'input_columns': ['input_ids', 'attention_mask'], 'num_parallel_workers': 8, 'numa_enable': False, 'output_columns': ['input_ids', 'attention_mask'], 'prefetch_size': 1, 'profile': False, 'python_multiprocessing': False, 'repeat': 1, 'seed': 42}, 'train_dataset_task': {'dataset_config': {'auto_tune': False, 'autotune_per_step': 10, 'batch_size': 1024, 'data_loader': {'dataset_dir': '', 'shuffle': True, 'type': 'MindDataset'}, 'do_eval': False, 'drop_remainder': True, 'filepath_prefix': './autotune', 'input_columns': ['input_ids', 'attention_mask'], 'num_parallel_workers': 8, 'numa_enable': False, 'output_columns': ['input_ids', 'attention_mask'], 'prefetch_size': 1, 'profile': False, 'python_multiprocessing': False, 'repeat': 1, 'seed': 42}, 'type': 'CausalLanguageModelDataset'}, 'trainer': {'model_name': 'gpt2', 'type': 'CausalLanguageModelingTrainer'}, 'transform_process_num': 1, 'use_parallel': True} 2024-12-27 13:12:35,649 - mindformers[mindformers/trainer/base_trainer.py:787] - INFO - .........Model Compiling, Please Wait a Moment........... [WARNING] ME(15286:281473333435408,MainProcess):2024-12-27-13:12:35.650.253 [mindspore/train/model.py:1419] For MFLossMonitor callback, {'epoch_begin', 'epoch_end', 'step_begin', 'step_end'} methods may not be supported in later version, Use methods prefixed with 'on_train' or 'on_eval' instead when using customized callbacks. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.857.081 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo3838-RmsNormInfo3737 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.858.564 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo2525-RmsNormInfo2424 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.859.555 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo3838-RmsNormInfo3737. 
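A quick cross-check of the dumped config against the dataset logs above: with a train dataset of 5 batches, 1 epoch, no sink mode, and gradient_accumulation_steps = 1 (which the earlier warning says is ignored under pipeline parallel), the run performs 5 optimizer steps, matching `total_steps: 5` in the lr_schedule. A sketch of that bookkeeping as plain arithmetic on the logged values, not a mindformers API:

```python
# Values taken from the config dump and dataset logs above.
dataset_size = 5
epochs = 1
gradient_accumulation_steps = 1   # not effective under pipeline parallel, per the warning above
lr_total_steps = 5                # lr_schedule.total_steps in the dumped config

train_steps = dataset_size * epochs // gradient_accumulation_steps
assert train_steps == lr_total_steps == 5
print(f"train steps = {train_steps}, lr schedule total_steps = {lr_total_steps}")
```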
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.859.606 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo3838-RmsNormInfo3737 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.861.177 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo8585-RmsNormInfo8484 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.862.176 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo2525-RmsNormInfo2424. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.862.206 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo2525-RmsNormInfo2424 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.862.298 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo3535, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.862.323 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo3636-ReshapeInfo3535 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.862.403 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo4747, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.862.424 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo3636-ReshapeInfo4747 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.862.502 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo5656, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.862.522 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo3636-ReshapeInfo5656 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.864.039 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo7272-RmsNormInfo7171 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.864.457 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo8585-RmsNormInfo8484. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.864.484 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo8585-RmsNormInfo8484
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.866.745 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo132132-RmsNormInfo131131
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.867.686 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo7272-RmsNormInfo7171.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.867.715 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo7272-RmsNormInfo7171
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.867.811 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo8282, which may cause error.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.867.835 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo8383-ReshapeInfo8282
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.867.911 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo9494, which may cause error.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.867.932 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo8383-ReshapeInfo9494
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.868.007 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo103103, which may cause error.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.868.028 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo8383-ReshapeInfo103103
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.870.588 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo119119-RmsNormInfo118118
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.871.041 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo132132-RmsNormInfo131131.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.871.068 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo132132-RmsNormInfo131131
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.874.417 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo5858's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.874.450 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo5959-ReshapeInfo5858
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.875.120 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo179179-RmsNormInfo178178
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.876.022 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo119119-RmsNormInfo118118.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.876.050 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo119119-RmsNormInfo118118
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.876.154 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo129129, which may cause error.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.876.179 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo130130-ReshapeInfo129129
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.876.259 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo141141, which may cause error.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.876.280 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo130130-ReshapeInfo141141
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.876.356 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo150150, which may cause error.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.876.376 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo130130-ReshapeInfo150150
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.878.548 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo5858's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.878.579 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo5959-ReshapeInfo5858
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.878.912 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo1919's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.878.930 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo2020-ReshapeInfo1919
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.879.065 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo1919's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.879.091 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo2020-ReshapeInfo1919
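Note on reading the CheckStrategyConsistencyByInputLayout dumps above (an interpretation added for readability, not part of the tool's output): each entry of "tensor map" names the axis of "device arrangement", counted from the right, along which the corresponding tensor axis is split, and -1 means that axis is replicated. On that reading, the desired input layout for ReshapeInfo5858 (device arrangement [ 4 ], tensor map [ 0 -1 ]) splits the first axis of the [ 4096 22016 ] tensor across the 4 devices, giving a 1024 x 22016 slice per device (4096 / 4 = 1024), while the selected layout (tensor map [ -1 -1 ]) keeps the tensor fully replicated on every device; reconciling those two layouts is the redistribution cost that CheckStrategyConsistency reports.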
[... the same set of PARALLEL warnings (CheckVisitedEdgeConsistency, GetNextOpStrategyByPrevOpStrategyWithMiniComm, CheckStrategyConsistency, GetSWCIndexByInputLayoutWithZeroComm, GetReshapeSWCIndexByPrevOpStrategy, and CheckStrategyConsistencyByInputLayout layout dumps for the [ 4096 22016 ] reshape inputs) repeats between 13:13:44.879 and 13:13:44.952 for the remaining operator instances, from CastInfo166166-RmsNormInfo165165, MatMulInfo106106-ReshapeInfo105105 and SiLUInfo6767-ReshapeInfo6666 up through CastInfo602602-RmsNormInfo601601, MatMulInfo482482-ReshapeInfo481481, SiLUInfo396396-ReshapeInfo395395 and ReshapeInfo564564 ...]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.952.059 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo553553-ReshapeInfo564564 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.952.133 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo573573, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.952.153 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo553553-ReshapeInfo573573 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.954.223 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo481481's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.954.268 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo482482-ReshapeInfo481481 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.954.559 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo442442's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.954.575 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo443443-ReshapeInfo442442 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.954.706 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo442442's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor 
shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.954.719 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo443443-ReshapeInfo442442 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.955.476 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo589589-RmsNormInfo588588 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.955.891 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo602602-RmsNormInfo601601. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.955.916 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo602602-RmsNormInfo601601 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.958.600 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo528528's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.958.639 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo529529-ReshapeInfo528528 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.959.255 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo649649-RmsNormInfo648648 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.960.119 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo589589-RmsNormInfo588588. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.960.147 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo589589-RmsNormInfo588588 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.960.233 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo599599, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.960.256 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo600600-ReshapeInfo599599 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.960.331 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo611611, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.960.352 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo600600-ReshapeInfo611611 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.960.427 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo620620, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.960.448 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo600600-ReshapeInfo620620 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.962.499 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo528528's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.962.529 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo529529-ReshapeInfo528528 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.962.835 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo489489's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.962.862 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo490490-ReshapeInfo489489 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.962.997 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo489489's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.963.010 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo490490-ReshapeInfo489489 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.963.765 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo636636-RmsNormInfo635635 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.964.182 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo649649-RmsNormInfo648648. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.964.208 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo649649-RmsNormInfo648648 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.966.855 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo575575's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.966.885 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo576576-ReshapeInfo575575 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.967.492 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo696696-RmsNormInfo695695 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.968.345 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo636636-RmsNormInfo635635. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.968.383 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo636636-RmsNormInfo635635 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.968.468 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo646646, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.968.492 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo647647-ReshapeInfo646646 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.968.572 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo658658, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.968.592 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo647647-ReshapeInfo658658 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.968.665 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo667667, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.968.685 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo647647-ReshapeInfo667667 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.970.726 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo575575's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.970.757 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo576576-ReshapeInfo575575 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.971.036 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo536536's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 
4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.971.054 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo537537-ReshapeInfo536536 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.971.198 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo536536's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.971.212 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo537537-ReshapeInfo536536 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.972.034 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo683683-RmsNormInfo682682 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.972.450 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo696696-RmsNormInfo695695. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.972.475 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo696696-RmsNormInfo695695 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.975.166 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo622622's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.975.196 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo623623-ReshapeInfo622622 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.975.851 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo743743-RmsNormInfo742742 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.976.714 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo683683-RmsNormInfo682682. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.976.742 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo683683-RmsNormInfo682682 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.976.828 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo693693, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.976.852 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo694694-ReshapeInfo693693 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.976.940 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo705705, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.976.961 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo694694-ReshapeInfo705705 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.977.035 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo714714, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.977.056 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo694694-ReshapeInfo714714 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.979.105 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo622622's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.979.135 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo623623-ReshapeInfo622622 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.979.426 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo583583's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.979.443 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo584584-ReshapeInfo583583 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.979.576 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo583583's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.979.599 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo584584-ReshapeInfo583583 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.980.371 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo730730-RmsNormInfo729729 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.980.808 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo743743-RmsNormInfo742742. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.980.833 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo743743-RmsNormInfo742742 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.983.489 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo669669's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.983.520 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo670670-ReshapeInfo669669 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.984.185 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo790790-RmsNormInfo789789 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.985.063 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo730730-RmsNormInfo729729. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.985.092 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo730730-RmsNormInfo729729 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.985.172 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo740740, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.985.194 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo741741-ReshapeInfo740740 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.985.270 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo752752, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.985.291 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo741741-ReshapeInfo752752 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.985.370 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo761761, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.985.403 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo741741-ReshapeInfo761761 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.987.467 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo669669's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.987.497 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo670670-ReshapeInfo669669 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.987.789 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo630630's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.987.805 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo631631-ReshapeInfo630630 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.987.937 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo630630's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor 
shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.987.951 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo631631-ReshapeInfo630630 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.988.772 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo777777-RmsNormInfo776776 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.989.178 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo790790-RmsNormInfo789789. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.989.213 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo790790-RmsNormInfo789789 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.991.919 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo716716's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.991.950 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo717717-ReshapeInfo716716 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.992.593 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo837837-RmsNormInfo836836 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.993.536 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo777777-RmsNormInfo776776. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.993.564 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo777777-RmsNormInfo776776 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.993.691 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo787787, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.993.717 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo788788-ReshapeInfo787787 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.993.793 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo799799, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.993.814 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo788788-ReshapeInfo799799 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.993.887 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo808808, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.993.907 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo788788-ReshapeInfo808808 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.995.961 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo716716's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.996.002 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo717717-ReshapeInfo716716 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.996.298 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo677677's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.996.317 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo678678-ReshapeInfo677677 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.996.449 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo677677's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.996.462 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo678678-ReshapeInfo677677 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.997.249 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo824824-RmsNormInfo823823 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.997.727 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo837837-RmsNormInfo836836. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:44.997.753 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo837837-RmsNormInfo836836 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.000.410 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo763763's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.000.450 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo764764-ReshapeInfo763763 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.001.117 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo884884-RmsNormInfo883883 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.002.065 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo824824-RmsNormInfo823823. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.002.094 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo824824-RmsNormInfo823823 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.002.176 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo834834, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.002.199 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo835835-ReshapeInfo834834 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.002.275 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo846846, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.002.296 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo835835-ReshapeInfo846846 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.002.372 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo855855, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.002.393 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo835835-ReshapeInfo855855 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.004.402 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo763763's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.004.431 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo764764-ReshapeInfo763763 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.004.738 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo724724's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 
4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.004.755 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo725725-ReshapeInfo724724 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.004.887 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo724724's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.004.901 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo725725-ReshapeInfo724724 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.005.697 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo871871-RmsNormInfo870870 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.006.142 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo884884-RmsNormInfo883883. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.006.168 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo884884-RmsNormInfo883883 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.008.811 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo810810's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.008.841 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo811811-ReshapeInfo810810 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.009.482 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo931931-RmsNormInfo930930 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.010.399 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo871871-RmsNormInfo870870. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.010.428 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo871871-RmsNormInfo870870 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.010.511 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo881881, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.010.535 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo882882-ReshapeInfo881881 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.010.608 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo893893, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.010.629 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo882882-ReshapeInfo893893 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.010.701 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo902902, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.010.722 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo882882-ReshapeInfo902902 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.012.789 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo810810's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.012.818 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo811811-ReshapeInfo810810 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.013.124 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo771771's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.013.151 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo772772-ReshapeInfo771771 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.013.286 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo771771's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.013.300 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo772772-ReshapeInfo771771 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.014.101 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo918918-RmsNormInfo917917 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.014.537 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo931931-RmsNormInfo930930. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.014.562 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo931931-RmsNormInfo930930 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.017.299 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo857857's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.017.328 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo858858-ReshapeInfo857857 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.017.989 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo978978-RmsNormInfo977977 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.018.868 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo918918-RmsNormInfo917917. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.018.896 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo918918-RmsNormInfo917917 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.018.992 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo928928, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.019.015 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo929929-ReshapeInfo928928 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.019.086 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo940940, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.019.106 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo929929-ReshapeInfo940940 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.019.179 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo949949, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.019.200 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo929929-ReshapeInfo949949 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.021.224 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo857857's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.021.253 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo858858-ReshapeInfo857857 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.021.546 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo818818's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.021.563 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo819819-ReshapeInfo818818 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.021.734 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo818818's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor 
shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.021.759 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo819819-ReshapeInfo818818 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.022.518 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo965965-RmsNormInfo964964 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.022.925 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo978978-RmsNormInfo977977. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.022.950 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo978978-RmsNormInfo977977 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.025.608 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo904904's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.025.636 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo905905-ReshapeInfo904904 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.026.289 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo10251025-RmsNormInfo10241024 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.027.179 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo965965-RmsNormInfo964964. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.027.207 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo965965-RmsNormInfo964964 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.027.289 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo975975, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.027.312 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo976976-ReshapeInfo975975 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.027.388 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo987987, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.027.419 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo976976-ReshapeInfo987987 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.027.497 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo996996, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.027.518 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo976976-ReshapeInfo996996 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.029.546 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo904904's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.029.577 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo905905-ReshapeInfo904904 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.029.924 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo865865's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.029.942 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo866866-ReshapeInfo865865 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.030.074 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo865865's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.030.087 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo866866-ReshapeInfo865865 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.030.883 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo10121012-RmsNormInfo10111011 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.031.302 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo10251025-RmsNormInfo10241024. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.031.328 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo10251025-RmsNormInfo10241024 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.034.019 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo951951's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.034.049 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo952952-ReshapeInfo951951 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.034.726 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo10721072-RmsNormInfo10711071 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.035.582 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo10121012-RmsNormInfo10111011. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.035.609 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo10121012-RmsNormInfo10111011 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.035.694 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo10221022, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.035.717 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo10231023-ReshapeInfo10221022 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.035.793 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo10341034, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.035.814 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo10231023-ReshapeInfo10341034 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.035.894 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo10431043, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.035.926 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo10231023-ReshapeInfo10431043 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.037.980 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo951951's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.038.010 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo952952-ReshapeInfo951951 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.038.303 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo912912's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] 
tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.038.320 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo913913-ReshapeInfo912912 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.038.453 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo912912's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.038.466 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo913913-ReshapeInfo912912 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.039.221 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo10591059-RmsNormInfo10581058 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.039.661 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo10721072-RmsNormInfo10711071. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.039.699 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo10721072-RmsNormInfo10711071 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.042.374 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo998998's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.042.405 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo999999-ReshapeInfo998998 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.043.038 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo11191119-RmsNormInfo11181118 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.043.971 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo10591059-RmsNormInfo10581058. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.043.999 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo10591059-RmsNormInfo10581058 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.044.089 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo10691069, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.044.113 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo10701070-ReshapeInfo10691069 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.044.193 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo10811081, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.044.215 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo10701070-ReshapeInfo10811081 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.044.288 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo10901090, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.044.308 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo10701070-ReshapeInfo10901090 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.046.378 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo998998's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.046.420 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo999999-ReshapeInfo998998 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.046.709 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo959959's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.046.727 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo960960-ReshapeInfo959959 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.046.860 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo959959's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.046.874 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo960960-ReshapeInfo959959 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.047.649 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo11061106-RmsNormInfo11051105 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.048.129 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo11191119-RmsNormInfo11181118. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.048.154 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo11191119-RmsNormInfo11181118 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.050.870 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo10451045's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.050.912 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo10461046-ReshapeInfo10451045 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.051.556 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo11661166-RmsNormInfo11651165 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.052.464 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo11061106-RmsNormInfo11051105. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.052.492 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo11061106-RmsNormInfo11051105 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.052.578 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo11161116, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.052.601 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo11171117-ReshapeInfo11161116 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.052.679 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo11281128, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.052.700 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo11171117-ReshapeInfo11281128 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.052.774 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo11371137, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.052.794 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo11171117-ReshapeInfo11371137 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.054.845 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo10451045's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.054.875 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo10461046-ReshapeInfo10451045 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.055.194 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo10061006's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.055.213 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo10071007-ReshapeInfo10061006 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.055.345 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo10061006's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map 
origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.055.358 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo10071007-ReshapeInfo10061006 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.056.097 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo11531153-RmsNormInfo11521152 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.056.521 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo11661166-RmsNormInfo11651165. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.056.545 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo11661166-RmsNormInfo11651165 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.059.241 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo10921092's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.059.274 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo10931093-ReshapeInfo10921092 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.059.946 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo12131213-RmsNormInfo12121212 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.060.804 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo11531153-RmsNormInfo11521152. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.060.833 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo11531153-RmsNormInfo11521152 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.060.924 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo11631163, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.060.949 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo11641164-ReshapeInfo11631163 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.061.033 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo11751175, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.061.056 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo11641164-ReshapeInfo11751175 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.061.130 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo11841184, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.061.151 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo11641164-ReshapeInfo11841184 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.063.227 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo10921092's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.063.258 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo10931093-ReshapeInfo10921092 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.063.546 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo10531053's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.063.574 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo10541054-ReshapeInfo10531053 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.063.708 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo10531053's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.063.721 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo10541054-ReshapeInfo10531053 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.064.397 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo12001200-RmsNormInfo11991199 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.064.773 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo12131213-RmsNormInfo12121212. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.064.796 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo12131213-RmsNormInfo12121212 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.067.490 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo11391139's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.067.520 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo11401140-ReshapeInfo11391139 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.068.038 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo12601260-RmsNormInfo12591259 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.068.880 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo12001200-RmsNormInfo11991199. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.068.907 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo12001200-RmsNormInfo11991199 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.068.992 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo12101210, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.069.015 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo12111211-ReshapeInfo12101210 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.069.083 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo12221222, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.069.104 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo12111211-ReshapeInfo12221222 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.069.171 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo12311231, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.069.191 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo12111211-ReshapeInfo12311231 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.071.287 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo11391139's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.071.317 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo11401140-ReshapeInfo11391139 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.071.613 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo11001100's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 
2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.071.629 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo11011101-ReshapeInfo11001100
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.071.761 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo11001100's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.072.444 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo12471247-RmsNormInfo12461246
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.072.747 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo12601260-RmsNormInfo12591259.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.072.769 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo12601260-RmsNormInfo12591259
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.075.417 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo11861186's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.075.448 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo11871187-ReshapeInfo11861186
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.076.771 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo12571257, which may cause error.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.076.793 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo12581258-ReshapeInfo12571257
[... the same group of PARALLEL warnings (CheckVisitedEdgeConsistency, CheckStrategyConsistencyByInputLayout, GetNextOpStrategyByPrevOpStrategyWithMiniComm, CheckStrategyConsistency, GetSWCIndexByInputLayoutWithZeroComm, GetReshapeSWCIndexByPrevOpStrategy) recurs for the remaining MatMul/SiLU/Cast/RmsNorm/Reshape operator instances, with the same layouts and instance IDs advancing through ReshapeInfo16921692 and CastInfo17301730-RmsNormInfo17291729, timestamps 2024-12-27-13:13:45.080 through 13:13:45.142 ...]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.142.133 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo16811681-ReshapeInfo16921692 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.142.197 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo17011701, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.142.216 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo16811681-ReshapeInfo17011701 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.143.956 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo16091609's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.143.993 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo16101610-ReshapeInfo16091609 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.144.266 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo15701570's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.144.282 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo15711571-ReshapeInfo15701570 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.144.413 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo15701570's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map 
origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.144.425 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo15711571-ReshapeInfo15701570 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.145.070 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo17171717-RmsNormInfo17161716 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.145.375 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo17301730-RmsNormInfo17291729. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.145.397 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo17301730-RmsNormInfo17291729 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.147.868 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo16561656's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.147.908 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo16571657-ReshapeInfo16561656 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.148.418 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo17771777-RmsNormInfo17761776 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.149.113 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo17171717-RmsNormInfo17161716. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.149.137 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo17171717-RmsNormInfo17161716 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.149.210 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo17271727, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.149.233 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo17281728-ReshapeInfo17271727 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.149.300 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo17391739, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.149.319 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo17281728-ReshapeInfo17391739 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.149.383 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo17481748, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.149.403 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo17281728-ReshapeInfo17481748 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.151.188 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo16561656's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.151.220 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo16571657-ReshapeInfo16561656 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.151.487 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo16171617's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.151.513 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo16181618-ReshapeInfo16171617 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.151.646 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo16171617's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.151.659 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo16181618-ReshapeInfo16171617 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.152.311 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo17641764-RmsNormInfo17631763 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.152.613 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo17771777-RmsNormInfo17761776. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.152.633 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo17771777-RmsNormInfo17761776 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.155.086 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo17031703's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.155.116 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo17041704-ReshapeInfo17031703 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.155.620 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo18241824-RmsNormInfo18231823 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.156.303 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo17641764-RmsNormInfo17631763. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.156.339 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo17641764-RmsNormInfo17631763 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.156.411 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo17741774, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.156.433 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo17751775-ReshapeInfo17741774 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.156.499 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo17861786, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.156.519 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo17751775-ReshapeInfo17861786 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.156.583 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo17951795, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.156.602 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo17751775-ReshapeInfo17951795 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.158.404 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo17031703's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.158.434 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo17041704-ReshapeInfo17031703 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.158.701 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo16641664's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 
2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.158.716 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo16651665-ReshapeInfo16641664 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.158.859 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo16641664's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.158.872 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo16651665-ReshapeInfo16641664 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.159.519 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo18111811-RmsNormInfo18101810 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.159.822 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo18241824-RmsNormInfo18231823. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.159.842 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo18241824-RmsNormInfo18231823 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.162.284 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo17501750's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.162.314 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo17511751-ReshapeInfo17501750 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.162.818 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo18711871-RmsNormInfo18701870 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.163.509 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo18111811-RmsNormInfo18101810. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.163.533 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo18111811-RmsNormInfo18101810 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.163.606 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo18211821, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.163.638 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo18221822-ReshapeInfo18211821 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.163.705 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo18331833, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.163.725 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo18221822-ReshapeInfo18331833 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.163.789 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo18421842, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.163.809 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo18221822-ReshapeInfo18421842 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.165.571 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo17501750's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.165.599 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo17511751-ReshapeInfo17501750 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.165.896 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo17111711's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.165.915 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo17121712-ReshapeInfo17111711 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.166.046 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo17111711's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.166.071 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo17121712-ReshapeInfo17111711 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.166.720 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo18581858-RmsNormInfo18571857 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.167.024 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo18711871-RmsNormInfo18701870. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.167.046 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo18711871-RmsNormInfo18701870 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.169.455 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo17971797's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.169.484 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo17981798-ReshapeInfo17971797 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.170.047 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo19181918-RmsNormInfo19171917 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.170.743 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo18581858-RmsNormInfo18571857. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.170.768 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo18581858-RmsNormInfo18571857 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.170.841 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo18681868, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.170.863 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo18691869-ReshapeInfo18681868 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.170.927 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo18801880, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.170.958 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo18691869-ReshapeInfo18801880 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.171.024 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo18891889, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.171.044 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo18691869-ReshapeInfo18891889 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.172.804 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo17971797's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.172.833 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo17981798-ReshapeInfo17971797 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.173.103 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo17581758's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.173.119 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo17591759-ReshapeInfo17581758 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.173.250 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo17581758's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map 
origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.173.263 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo17591759-ReshapeInfo17581758 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.173.941 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo19051905-RmsNormInfo19041904 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.174.261 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo19181918-RmsNormInfo19171917. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.174.282 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo19181918-RmsNormInfo19171917 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.176.699 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo18441844's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.176.727 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo18451845-ReshapeInfo18441844 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.177.230 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo19651965-RmsNormInfo19641964 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.177.961 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo19051905-RmsNormInfo19041904. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.177.987 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo19051905-RmsNormInfo19041904 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.178.058 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo19151915, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.178.081 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo19161916-ReshapeInfo19151915 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.178.145 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo19271927, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.178.165 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo19161916-ReshapeInfo19271927 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.178.229 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo19361936, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.178.248 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo19161916-ReshapeInfo19361936 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.179.991 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo18441844's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.180.021 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo18451845-ReshapeInfo18441844 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.180.314 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo18051805's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.180.333 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo18061806-ReshapeInfo18051805 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.180.465 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo18051805's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.180.478 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo18061806-ReshapeInfo18051805 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.181.117 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo19521952-RmsNormInfo19511951 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.181.419 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo19651965-RmsNormInfo19641964. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.181.441 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo19651965-RmsNormInfo19641964 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.183.914 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo18911891's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.183.943 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo18921892-ReshapeInfo18911891 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.184.446 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo20122012-RmsNormInfo20112011 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.185.150 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo19521952-RmsNormInfo19511951. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.185.175 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo19521952-RmsNormInfo19511951 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.185.247 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo19621962, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.185.269 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo19631963-ReshapeInfo19621962 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.185.335 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo19741974, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.185.355 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo19631963-ReshapeInfo19741974 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.185.420 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo19831983, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.185.439 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo19631963-ReshapeInfo19831983 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.187.201 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo18911891's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.187.245 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo18921892-ReshapeInfo18911891 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.187.535 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo18521852's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 
2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.187.552 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo18531853-ReshapeInfo18521852 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.187.683 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo18521852's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.187.696 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo18531853-ReshapeInfo18521852 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.188.343 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo19991999-RmsNormInfo19981998 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.188.649 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo20122012-RmsNormInfo20112011. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.188.670 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo20122012-RmsNormInfo20112011 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.191.149 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo19381938's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.191.191 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo19391939-ReshapeInfo19381938 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.191.702 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo20592059-RmsNormInfo20582058 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.192.410 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo19991999-RmsNormInfo19981998. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.192.434 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo19991999-RmsNormInfo19981998 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.192.506 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo20092009, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.192.528 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo20102010-ReshapeInfo20092009 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.192.592 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo20212021, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.192.612 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo20102010-ReshapeInfo20212021 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.192.676 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo20302030, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.192.695 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo20102010-ReshapeInfo20302030 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.194.505 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo19381938's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.194.535 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo19391939-ReshapeInfo19381938 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.194.823 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo18991899's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.194.851 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo19001900-ReshapeInfo18991899 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.194.984 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo18991899's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.194.998 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo19001900-ReshapeInfo18991899 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.195.653 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo20462046-RmsNormInfo20452045 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.195.966 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo20592059-RmsNormInfo20582058. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.195.989 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo20592059-RmsNormInfo20582058 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.198.463 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo19851985's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.198.494 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo19861986-ReshapeInfo19851985 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.199.002 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo21062106-RmsNormInfo21052105 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.199.705 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo20462046-RmsNormInfo20452045. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.199.732 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo20462046-RmsNormInfo20452045 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.199.806 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo20562056, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.199.828 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo20572057-ReshapeInfo20562056 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.199.893 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo20682068, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.199.913 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo20572057-ReshapeInfo20682068 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.199.977 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo20772077, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.199.996 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo20572057-ReshapeInfo20772077 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.201.801 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo19851985's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.201.831 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo19861986-ReshapeInfo19851985 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.202.100 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo19461946's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.202.127 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo19471947-ReshapeInfo19461946 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.202.261 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo19461946's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map 
origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.202.275 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo19471947-ReshapeInfo19461946 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.202.930 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo20932093-RmsNormInfo20922092 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.203.236 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo21062106-RmsNormInfo21052105. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.203.258 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo21062106-RmsNormInfo21052105 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.205.725 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo20322032's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.205.753 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo20332033-ReshapeInfo20322032 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.206.258 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo21532153-RmsNormInfo21522152 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.206.952 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo20932093-RmsNormInfo20922092. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.206.976 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo20932093-RmsNormInfo20922092 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.207.047 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo21032103, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.207.080 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo21042104-ReshapeInfo21032103 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.207.148 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo21152115, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.207.168 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo21042104-ReshapeInfo21152115 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.207.231 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo21242124, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.207.251 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo21042104-ReshapeInfo21242124 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.208.994 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo20322032's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.209.023 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo20332033-ReshapeInfo20322032 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.209.297 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo19931993's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.209.313 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo19941994-ReshapeInfo19931993 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.209.445 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo19931993's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.209.471 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo19941994-ReshapeInfo19931993 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.210.175 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo21402140-RmsNormInfo21392139 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.210.485 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo21532153-RmsNormInfo21522152. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.210.507 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo21532153-RmsNormInfo21522152 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.212.938 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo20792079's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.212.966 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo20802080-ReshapeInfo20792079 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.213.475 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo22002200-RmsNormInfo21992199 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.214.212 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo21402140-RmsNormInfo21392139. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.214.240 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo21402140-RmsNormInfo21392139 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.214.312 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo21502150, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.214.334 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo21512151-ReshapeInfo21502150 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.214.399 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo21622162, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.214.430 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo21512151-ReshapeInfo21622162 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.214.497 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo21712171, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.214.517 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo21512151-ReshapeInfo21712171 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.216.253 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo20792079's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.216.282 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo20802080-ReshapeInfo20792079 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.216.553 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo20402040's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 
2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.216.569 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo20412041-ReshapeInfo20402040 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.216.700 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo20402040's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.216.713 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo20412041-ReshapeInfo20402040 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.217.375 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo21872187-RmsNormInfo21862186 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.217.704 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo22002200-RmsNormInfo21992199. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.217.729 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo22002200-RmsNormInfo21992199 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.220.161 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo21262126's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.220.189 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo21272127-ReshapeInfo21262126 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.220.696 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo22472247-RmsNormInfo22462246 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.221.397 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo21872187-RmsNormInfo21862186. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.221.422 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo21872187-RmsNormInfo21862186 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.221.492 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo21972197, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.221.514 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo21982198-ReshapeInfo21972197 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.221.578 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo22092209, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.221.597 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo21982198-ReshapeInfo22092209 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.221.726 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo22182218, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.221.759 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo21982198-ReshapeInfo22182218 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.223.498 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo21262126's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.223.526 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo21272127-ReshapeInfo21262126 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.223.796 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo20872087's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.223.812 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo20882088-ReshapeInfo20872087 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.223.943 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo20872087's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.223.956 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo20882088-ReshapeInfo20872087 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.224.610 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo22342234-RmsNormInfo22332233 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.224.915 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo22472247-RmsNormInfo22462246. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.224.947 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo22472247-RmsNormInfo22462246 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.227.439 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo21732173's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.227.469 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo21742174-ReshapeInfo21732173 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.227.979 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo22942294-RmsNormInfo22932293 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.228.674 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo22342234-RmsNormInfo22332233. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.228.699 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo22342234-RmsNormInfo22332233 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.228.770 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo22442244, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.228.792 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo22452245-ReshapeInfo22442244 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.228.857 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo22562256, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.228.879 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo22452245-ReshapeInfo22562256 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.228.944 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo22652265, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.228.963 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo22452245-ReshapeInfo22652265 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.230.754 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo21732173's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.230.796 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo21742174-ReshapeInfo21732173 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.231.063 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo21342134's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.231.081 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo21352135-ReshapeInfo21342134 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.231.211 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo21342134's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map 
origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.231.224 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo21352135-ReshapeInfo21342134 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.231.880 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo22812281-RmsNormInfo22802280 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.232.187 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo22942294-RmsNormInfo22932293. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.232.208 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo22942294-RmsNormInfo22932293 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.234.703 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo22202220's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.234.743 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo22212221-ReshapeInfo22202220 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.235.248 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo23412341-RmsNormInfo23402340 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.235.952 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo22812281-RmsNormInfo22802280. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.235.977 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo22812281-RmsNormInfo22802280 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.236.049 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo22912291, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.236.072 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo22922292-ReshapeInfo22912291 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.236.138 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo23032303, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.236.158 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo22922292-ReshapeInfo23032303 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.236.224 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo23122312, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.236.243 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo22922292-ReshapeInfo23122312 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.238.035 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo22202220's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.238.065 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo22212221-ReshapeInfo22202220 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.238.350 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo21812181's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.238.367 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo21822182-ReshapeInfo21812181 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.238.497 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo21812181's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.238.510 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo21822182-ReshapeInfo21812181 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.239.162 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo23282328-RmsNormInfo23272327 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.239.468 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo23412341-RmsNormInfo23402340. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.239.489 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo23412341-RmsNormInfo23402340 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.241.973 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo22672267's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.242.005 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo22682268-ReshapeInfo22672267 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.242.534 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo23882388-RmsNormInfo23872387 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.243.221 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo23282328-RmsNormInfo23272327. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.243.247 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo23282328-RmsNormInfo23272327 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.243.319 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo23382338, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.243.341 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo23392339-ReshapeInfo23382338 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.243.406 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo23502350, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.243.426 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo23392339-ReshapeInfo23502350 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.243.491 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo23592359, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.243.510 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo23392339-ReshapeInfo23592359 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.245.279 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo22672267's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.245.308 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo22682268-ReshapeInfo22672267 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.245.580 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo22282228's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 
2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.245.610 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo22292229-ReshapeInfo22282228
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.245.785 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo22282228's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.245.801 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo22292229-ReshapeInfo22282228
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.246.454 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo23752375-RmsNormInfo23742374
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.246.759 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo23882388-RmsNormInfo23872387.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.246.781 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo23882388-RmsNormInfo23872387
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.250.595 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo23852385, which may cause error.
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.250.620 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo23862386-ReshapeInfo23852385
[... the remaining warnings in this span, up to 13:13:45.315, repeat the same pattern (CheckVisitedEdgeConsistency at graph_costmodel.cc:141/147, CheckStrategyConsistencyByInputLayout at reshape_info.cc:972, GetNextOpStrategyByPrevOpStrategyWithMiniComm at edge_costmodel.cc:436, CheckStrategyConsistency at edge_costmodel.cc:624, GetSWCIndexByInputLayoutWithZeroComm at reshape_info.cc:913, GetReshapeSWCIndexByPrevOpStrategy at edge_costmodel.cc:559) with identical layouts and only the operator instance IDs advancing: ReshapeInfo through 28202820, CastInfo through 28582858, RmsNormInfo through 28572857, MatMulInfo through 27382738, SiLUInfo through 26522652 ...]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.315.821 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo28092809-ReshapeInfo28202820 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.315.886 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo28292829, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.315.905 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo28092809-ReshapeInfo28292829 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.317.704 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo27372737's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.317.733 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo27382738-ReshapeInfo27372737 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.318.007 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo26982698's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.318.024 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo26992699-ReshapeInfo26982698 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.318.157 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo26982698's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map 
origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.318.171 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo26992699-ReshapeInfo26982698 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.318.828 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo28452845-RmsNormInfo28442844 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.319.134 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo28582858-RmsNormInfo28572857. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.319.156 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo28582858-RmsNormInfo28572857 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.321.595 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo27842784's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.321.634 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo27852785-ReshapeInfo27842784 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.322.179 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo29052905-RmsNormInfo29042904 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.322.878 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo28452845-RmsNormInfo28442844. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.322.903 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo28452845-RmsNormInfo28442844 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.322.975 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo28552855, which may cause error. 
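The surrounding GetNextOpStrategyByPrevOpStrategyWithMiniComm, CheckStrategyConsistency ("redistribution cost occurs") and GetSWCIndexByInputLayoutWithZeroComm messages say the same thing from the search side: given the strategy already fixed for the previous operator, no zero-communication option exists for these Reshape and Cast-RmsNorm edges, so the search accepts one that requires redistribution. If it is worth checking what the search finally settled on, the chosen per-operator strategies can be dumped to a file and inspected offline; a minimal sketch using the standard auto-parallel context options is shown below (the file name is illustrative).

```python
# Minimal sketch: dump the strategies chosen by the auto-parallel search so the
# Reshape/RmsNorm edges flagged above can be inspected offline.  The file name is
# illustrative; the context options themselves are standard MindSpore arguments.
import mindspore as ms

ms.set_context(mode=ms.GRAPH_MODE)
ms.set_auto_parallel_context(
    parallel_mode="auto_parallel",
    search_mode="sharding_propagation",            # or "dynamic_programming"
    strategy_ckpt_save_file="./strategy_found.ckpt",
)
# ... build and compile the network as usual; a later run can reload the same
# decisions with strategy_ckpt_load_file="./strategy_found.ckpt".
```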
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.322.998 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo28562856-ReshapeInfo28552855 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.323.061 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo28672867, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.323.081 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo28562856-ReshapeInfo28672867 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.323.145 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo28762876, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.323.164 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo28562856-ReshapeInfo28762876 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.324.907 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo27842784's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.324.951 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo27852785-ReshapeInfo27842784 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.325.221 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo27452745's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.325.237 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo27462746-ReshapeInfo27452745 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.325.368 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo27452745's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.325.381 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo27462746-ReshapeInfo27452745 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.326.079 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo28922892-RmsNormInfo28912891 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.326.389 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo29052905-RmsNormInfo29042904. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.326.411 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo29052905-RmsNormInfo29042904 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.328.859 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo28312831's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.328.898 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo28322832-ReshapeInfo28312831 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.329.406 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo29522952-RmsNormInfo29512951 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.330.133 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo28922892-RmsNormInfo28912891. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.330.161 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo28922892-RmsNormInfo28912891 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.330.232 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo29022902, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.330.254 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo29032903-ReshapeInfo29022902 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.330.321 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo29142914, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.330.341 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo29032903-ReshapeInfo29142914 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.330.402 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo29232923, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.330.421 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo29032903-ReshapeInfo29232923 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.332.191 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo28312831's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.332.220 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo28322832-ReshapeInfo28312831 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.332.494 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo27922792's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 
2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.332.522 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo27932793-ReshapeInfo27922792 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.332.655 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo27922792's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.332.669 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo27932793-ReshapeInfo27922792 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.333.319 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo29392939-RmsNormInfo29382938 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.333.624 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo29522952-RmsNormInfo29512951. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.333.703 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo29522952-RmsNormInfo29512951 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.336.133 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo28782878's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.336.163 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo28792879-ReshapeInfo28782878 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.336.674 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo29992999-RmsNormInfo29982998 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.337.383 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo29392939-RmsNormInfo29382938. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.337.408 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo29392939-RmsNormInfo29382938 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.337.479 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo29492949, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.337.501 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo29502950-ReshapeInfo29492949 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.337.565 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo29612961, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.337.585 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo29502950-ReshapeInfo29612961 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.337.693 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo29702970, which may cause error. 
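Every one of these Reshape complaints sits on a MatMul-to-Reshape or SiLU-to-Reshape edge over an activation of shape [4096, 22016], which points at the gated feed-forward path. If the redistribution these warnings announce actually shows up as communication overhead, one option is to pin matching shard strategies on the operators around the reshape in semi-auto parallel, so both sides of the edge already agree on where the 22016 axis is split. The cell below only illustrates that idea: shapes are shrunk, the class is hypothetical rather than the mindformers implementation, it assumes ops.SiLU() accepts .shard() like other primitives, and it would still need the usual distributed launch to run.

```python
# Illustrative sketch (not the mindformers model): pin matching shard strategies
# around a reshape so the 4-way split of the feed-forward axis is identical on
# both sides of the edge.  Shapes are shrunk for readability.
import numpy as np
import mindspore as ms
from mindspore import nn, ops, Tensor

ms.set_context(mode=ms.GRAPH_MODE)
ms.set_auto_parallel_context(parallel_mode="semi_auto_parallel", device_num=4)

class TinyGatedFFN(nn.Cell):
    def __init__(self, hidden=16, ffn=32):
        super().__init__()
        self.ffn = ffn
        self.w1 = ms.Parameter(Tensor(np.ones((hidden, ffn)), ms.float16))
        self.w2 = ms.Parameter(Tensor(np.ones((ffn, hidden)), ms.float16))
        self.matmul1 = ops.MatMul().shard(((1, 1), (1, 4)))  # split the ffn axis 4-way
        self.act = ops.SiLU().shard(((1, 4),))               # keep the same split
        self.reshape = ops.Reshape()
        self.matmul2 = ops.MatMul().shard(((1, 4), (4, 1)))  # contract the split axis

    def construct(self, x):                        # x: [tokens, hidden]
        h = self.act(self.matmul1(x, self.w1))
        h = self.reshape(h, (1, -1, self.ffn))     # add a batch axis; the split
        h = self.reshape(h, (-1, self.ffn))        # ffn axis itself is untouched
        return self.matmul2(h, self.w2)
```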
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.337.715 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo29502950-ReshapeInfo29702970 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.339.474 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo28782878's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.339.503 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo28792879-ReshapeInfo28782878 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.339.777 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo28392839's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.339.793 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo28402840-ReshapeInfo28392839 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.339.935 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo28392839's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.339.949 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo28402840-ReshapeInfo28392839 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.340.607 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo29862986-RmsNormInfo29852985 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.340.912 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo29992999-RmsNormInfo29982998. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.340.934 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo29992999-RmsNormInfo29982998 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.343.402 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo29252925's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.343.434 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo29262926-ReshapeInfo29252925 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.343.953 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo30463046-RmsNormInfo30453045 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.344.659 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo29862986-RmsNormInfo29852985. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.344.685 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo29862986-RmsNormInfo29852985 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.344.756 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo29962996, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.344.789 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo29972997-ReshapeInfo29962996 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.344.855 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo30083008, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.344.875 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo29972997-ReshapeInfo30083008 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.344.940 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo30173017, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.344.959 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo29972997-ReshapeInfo30173017 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.346.773 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo29252925's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.346.805 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo29262926-ReshapeInfo29252925 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.347.077 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo28862886's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.347.095 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo28872887-ReshapeInfo28862886 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.347.226 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo28862886's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map 
origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.347.249 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo28872887-ReshapeInfo28862886 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.347.906 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo30333033-RmsNormInfo30323032 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.348.219 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo30463046-RmsNormInfo30453045. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.348.240 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo30463046-RmsNormInfo30453045 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.350.723 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo29722972's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.350.754 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo29732973-ReshapeInfo29722972 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.351.263 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo30933093-RmsNormInfo30923092 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.351.972 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo30333033-RmsNormInfo30323032. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.351.998 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo30333033-RmsNormInfo30323032 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.352.071 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo30433043, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.352.096 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo30443044-ReshapeInfo30433043 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.352.164 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo30553055, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.352.196 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo30443044-ReshapeInfo30553055 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.352.261 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo30643064, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.352.281 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo30443044-ReshapeInfo30643064 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.354.072 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo29722972's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.354.107 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo29732973-ReshapeInfo29722972 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.354.399 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo29332933's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.354.416 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo29342934-ReshapeInfo29332933 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.354.548 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo29332933's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.354.562 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo29342934-ReshapeInfo29332933 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.355.225 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo30803080-RmsNormInfo30793079 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.355.534 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo30933093-RmsNormInfo30923092. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.355.556 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo30933093-RmsNormInfo30923092 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.358.027 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo30193019's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.358.058 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo30203020-ReshapeInfo30193019 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.358.575 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo31403140-RmsNormInfo31393139 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.359.281 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo30803080-RmsNormInfo30793079. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.359.306 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo30803080-RmsNormInfo30793079 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.359.378 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo30903090, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.359.400 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo30913091-ReshapeInfo30903090 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.359.464 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo31023102, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.359.484 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo30913091-ReshapeInfo31023102 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.359.549 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo31113111, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.359.569 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo30913091-ReshapeInfo31113111 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.361.320 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo30193019's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.361.348 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo30203020-ReshapeInfo30193019 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.361.641 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo29802980's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 
2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.361.695 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo29812981-ReshapeInfo29802980 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.361.828 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo29802980's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.361.841 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo29812981-ReshapeInfo29802980 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.362.501 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo31273127-RmsNormInfo31263126 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.362.805 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo31403140-RmsNormInfo31393139. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.362.826 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo31403140-RmsNormInfo31393139 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.365.242 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo30663066's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.365.271 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo30673067-ReshapeInfo30663066 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.365.800 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo31873187-RmsNormInfo31863186 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.366.515 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo31273127-RmsNormInfo31263126. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.366.540 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo31273127-RmsNormInfo31263126 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.366.611 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo31373137, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.366.633 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo31383138-ReshapeInfo31373137 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.366.701 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo31493149, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.366.721 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo31383138-ReshapeInfo31493149 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.366.783 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo31583158, which may cause error. 
[... the same block of PARALLEL warnings repeats from 2024-12-27-13:13:45.366.803 through 13:13:45.431.908, emitted by graph_costmodel.cc:141/147 (CheckVisitedEdgeConsistency), edge_costmodel.cc:436 (GetNextOpStrategyByPrevOpStrategyWithMiniComm), edge_costmodel.cc:559 (GetReshapeSWCIndexByPrevOpStrategy), edge_costmodel.cc:624 (CheckStrategyConsistency), and reshape_info.cc:913/972 (GetSWCIndexByInputLayoutWithZeroComm / CheckStrategyConsistencyByInputLayout); the record texts are identical to those above, with the operator instance ids advancing from SiLUInfo30283028, ReshapeInfo30273027, MatMulInfo30673067, CastInfo31383138, RmsNormInfo31733173 up to SiLUInfo34043404, ReshapeInfo35723572, MatMulInfo34903490, CastInfo36103610, RmsNormInfo36093609 ...]
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.431.928 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo35613561-ReshapeInfo35723572 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.431.991 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo35813581, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.432.011 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo35613561-ReshapeInfo35813581 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.433.796 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo34893489's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.433.825 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo34903490-ReshapeInfo34893489 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.434.095 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo34503450's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.434.112 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo34513451-ReshapeInfo34503450 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.434.243 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo34503450's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map 
origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.434.267 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo34513451-ReshapeInfo34503450 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.434.916 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo35973597-RmsNormInfo35963596 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.435.223 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo36103610-RmsNormInfo36093609. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.435.244 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo36103610-RmsNormInfo36093609 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.437.693 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo35363536's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.437.728 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo35373537-ReshapeInfo35363536 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.438.237 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo36573657-RmsNormInfo36563656 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.438.941 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo35973597-RmsNormInfo35963596. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.438.966 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo35973597-RmsNormInfo35963596 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.439.038 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo36073607, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.439.061 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo36083608-ReshapeInfo36073607 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.439.140 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo36193619, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.439.161 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo36083608-ReshapeInfo36193619 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.439.224 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo36283628, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.439.243 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo36083608-ReshapeInfo36283628 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.441.007 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo35363536's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.441.036 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo35373537-ReshapeInfo35363536 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.441.308 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo34973497's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.441.324 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo34983498-ReshapeInfo34973497 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.441.455 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo34973497's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.441.478 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo34983498-ReshapeInfo34973497 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.442.150 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo36443644-RmsNormInfo36433643 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.442.463 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo36573657-RmsNormInfo36563656. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.442.486 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo36573657-RmsNormInfo36563656 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.444.907 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo35833583's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.444.936 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo35843584-ReshapeInfo35833583 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.445.452 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo37043704-RmsNormInfo37033703 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.446.201 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo36443644-RmsNormInfo36433643. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.446.228 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo36443644-RmsNormInfo36433643 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.446.300 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo36543654, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.446.322 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo36553655-ReshapeInfo36543654 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.446.388 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo36663666, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.446.408 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo36553655-ReshapeInfo36663666 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.446.474 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo36753675, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.446.505 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo36553655-ReshapeInfo36753675 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.448.246 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo35833583's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.448.276 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo35843584-ReshapeInfo35833583 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.448.567 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo35443544's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 
2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.448.585 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo35453545-ReshapeInfo35443544 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.448.717 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo35443544's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.448.730 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo35453545-ReshapeInfo35443544 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.449.385 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo36913691-RmsNormInfo36903690 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.449.709 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo37043704-RmsNormInfo37033703. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.449.744 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo37043704-RmsNormInfo37033703 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.452.165 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo36303630's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.452.193 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo36313631-ReshapeInfo36303630 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.452.700 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo37513751-RmsNormInfo37503750 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.453.406 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo36913691-RmsNormInfo36903690. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.453.431 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo36913691-RmsNormInfo36903690 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.453.503 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo37013701, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.453.525 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo37023702-ReshapeInfo37013701 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.453.593 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo37133713, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.453.613 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo37023702-ReshapeInfo37133713 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.453.716 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo37223722, which may cause error. 
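For a sense of scale, the tensor these warnings keep referring to is large enough that the replicated-versus-sharded choice is not cosmetic. A rough estimate, assuming float16 activations (an assumption; the dtype is not printed in these messages):

elems = 4096 * 22016                    # tensor shape from the warnings above
full_copy_mib = elems * 2 / 2**20       # 172.0 MiB if fully replicated (fp16 assumed)
per_shard_mib = full_copy_mib / 4       # 43.0 MiB per device when dim 0 is split 4 ways
print(full_copy_mib, per_shard_mib)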
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.453.738 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo37023702-ReshapeInfo37223722 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.455.478 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo36303630's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.455.517 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo36313631-ReshapeInfo36303630 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.455.808 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo35913591's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.455.825 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo35923592-ReshapeInfo35913591 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.455.956 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo35913591's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.455.969 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo35923592-ReshapeInfo35913591 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.456.522 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo37383738-RmsNormInfo37373737 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.456.830 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo37513751-RmsNormInfo37503750. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.456.850 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo37513751-RmsNormInfo37503750 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.459.314 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo36773677's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.459.353 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo36783678-ReshapeInfo36773677 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.459.563 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:436] GetNextOpStrategyByPrevOpStrategyWithMiniComm] Inconsistency occurred at edge: CastInfo37913791-RmsNormInfo37903790 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.460.270 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo37383738-RmsNormInfo37373737. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.460.294 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo37383738-RmsNormInfo37373737 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.460.366 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo37483748, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.460.388 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo37493749-ReshapeInfo37483748 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.460.453 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo37603760, which may cause error. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.460.473 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo37493749-ReshapeInfo37603760 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.460.538 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:913] GetSWCIndexByInputLayoutWithZeroComm] There is no available strategy for zero communication cost for reshape: ReshapeInfo37693769, which may cause error. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.460.557 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:559] GetReshapeSWCIndexByPrevOpStrategy] Inconsistency occurred at edge: CastInfo37493749-ReshapeInfo37693769 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.462.329 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo36773677's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.462.373 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo36783678-ReshapeInfo36773677 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.462.648 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo36383638's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.462.666 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo36393639-ReshapeInfo36383638 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.462.801 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo36383638's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map 
origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.462.814 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo36393639-ReshapeInfo36383638 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.463.000 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: CastInfo37913791-RmsNormInfo37903790. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.463.018 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: CastInfo37913791-RmsNormInfo37903790 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.465.485 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo37243724's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.465.514 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo37253725-ReshapeInfo37243724 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.467.752 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo37243724's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.467.783 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo37253725-ReshapeInfo37243724 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.468.051 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo36853685's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] 
tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.468.068 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo36863686-ReshapeInfo36853685 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.468.199 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo36853685's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.468.212 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo36863686-ReshapeInfo36853685 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.469.914 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo37713771's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.469.956 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo37723772-ReshapeInfo37713771 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.470.259 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: MatMulInfo37873787-CastInfo37863786. 
[WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.470.280 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:172] CheckConfiguredSuccEdgeConsistency] Inconsistency occurred at edge: MatMulInfo37873787-CastInfo37863786 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.470.751 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo37713771's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 1 ] tensor map origin = [ 2 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.470.772 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo37723772-ReshapeInfo37713771 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.471.038 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo37323732's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.471.055 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo37333733-ReshapeInfo37323732 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.471.185 [mindspore/ccsrc/frontend/parallel/ops_info/reshape_info.cc:972] CheckStrategyConsistencyByInputLayout] ReshapeInfo37323732's desired input layout is: device arrangement = [ 4 ] tensor map = [ 0 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 4 1 ] tensor map origin = [ 1 0 ] tensor shape origin = [ 4096 22016 ], while the selected input layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 ] tensor shape = [ 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ 2 1 ] tensor shape origin = [ 4096 22016 ] and the output layout is: device arrangement = [ 4 ] tensor map = [ -1 -1 -1 ] tensor shape = [ 1 4096 22016 ] device arrangement origin = [ 1 1 4 ] tensor map origin = [ -1 -1 -1 ] tensor shape origin = [ 1 4096 22016 ] [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.471.198 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:141] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SiLUInfo37333733-ReshapeInfo37323732 [WARNING] 
PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.654.837 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: MatMulInfo37873787-CastInfo37863786. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.654.933 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: MatMulInfo37873787-CastInfo37863786 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.812.253 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: SubInfo37933793-MulInfo38013801. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.812.326 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:195] CheckConfiguredPrevEdgeConsistency] Inconsistency occurred at edge: SubInfo37933793-MulInfo38013801 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.812.429 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: OneHotInfo37923792-MulInfo38013801. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.812.445 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:195] CheckConfiguredPrevEdgeConsistency] Inconsistency occurred at edge: OneHotInfo37923792-MulInfo38013801 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.812.482 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: OneHotInfo37923792-MulInfo38013801. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.812.495 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: OneHotInfo37923792-MulInfo38013801 [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.817.313 [mindspore/ccsrc/frontend/parallel/auto_parallel/edge_costmodel.cc:624] CheckStrategyConsistency] There are redistribution cost occurs at edge: SubInfo37933793-MulInfo38013801. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.817.350 [mindspore/ccsrc/frontend/parallel/auto_parallel/graph_costmodel.cc:147] CheckVisitedEdgeConsistency] Inconsistency occurred at edge: SubInfo37933793-MulInfo38013801 [WARNING] DISTRIBUTED(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.825.247 [mindspore/ccsrc/distributed/collective/collective_manager.cc:329] CreateCommunicationGroup] Start to create communication group: 4-6301172352641561019 [const vector]{0, 1, 2, 3}, async: 0, submit_now: 0 [WARNING] DISTRIBUTED(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.825.320 [mindspore/ccsrc/distributed/collective/collective_manager.cc:661] CreateSimulationGroup] Create dummy communication group with group name: 4-6301172352641561019, group ranks: [const vector]{0, 1, 2, 3}. Real group size: 1. 
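The dummy communication groups created under simulation mode are consistent with a 4-way model-parallel layout over the 32 simulated ranks: the {0, 1, 2, 3} group above looks like rank 0's tensor-parallel group, and the {0, 4, 8, ..., 28} group created a little further down looks like its data-parallel/optimizer group. A quick check of that reading (plain Python, not how MindSpore derives the groups internally):

world_size, mp, rank = 32, 4, 0
tp_group = [r for r in range(world_size) if r // mp == rank // mp]
dp_group = [r for r in range(world_size) if r % mp == rank % mp]
print(tp_group)   # [0, 1, 2, 3]
print(dp_group)   # [0, 4, 8, 12, 16, 20, 24, 28]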
[WARNING] DISTRIBUTED(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.825.994 [mindspore/ccsrc/distributed/collective/collective_manager.cc:329] CreateCommunicationGroup] Start to create communication group: 1-2297668033614959926 [const vector]{0}, async: 0, submit_now: 0 [WARNING] DISTRIBUTED(15286,ffff9e0d9c10,python):2024-12-27-13:13:45.826.025 [mindspore/ccsrc/distributed/collective/collective_manager.cc:661] CreateSimulationGroup] Create dummy communication group with group name: 1-2297668033614959926, group ranks: [const vector]{0}. Real group size: 1. [PROF]parallel_strategy_search costs 5837.27 msec. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:46.163.860 [mindspore/ccsrc/frontend/parallel/pass/dataset_reader_optimizer.cc:305] BroadcastDataset] For now on, only dataset sink mode support dataset reader optimizer. [WARNING] PARALLEL(15286,ffff9e0d9c10,python):2024-12-27-13:13:53.756.965 [mindspore/ccsrc/frontend/parallel/parallel_processer.cc:112] GetSensLossPairs] Can not find the loss cnode [WARNING] DISTRIBUTED(15286,ffff9e0d9c10,python):2024-12-27-13:13:54.264.253 [mindspore/ccsrc/distributed/collective/collective_manager.cc:329] CreateCommunicationGroup] Start to create communication group: 8-1065305096478261272 [const vector]{0, 4, 8, 12, 16, 20, 24, 28}, async: 0, submit_now: 0 [WARNING] DISTRIBUTED(15286,ffff9e0d9c10,python):2024-12-27-13:13:54.264.361 [mindspore/ccsrc/distributed/collective/collective_manager.cc:661] CreateSimulationGroup] Create dummy communication group with group name: 8-1065305096478261272, group ranks: [const vector]{0, 4, 8, 12, 16, 20, 24, 28}. Real group size: 1. [PROF]InitCommGroup costs 15.274 msec. [PROF]WaitAllCommInit costs 0.005 msec. graph_kernel_flags = "" [PROF]ConstructKernelGraph costs 2684.93 msec. [PROF]EliminateIllegalDataTypePass costs 270.89 msec. [PROF]CommonUnifyMindIR costs 264.378 msec. [PROF]BackendCommonOptimization costs 2013.55 msec. [PROF]OptimizationWithoutBackend costs 2549.3 msec. [PROF]GEUnifyMindIR costs 5491.32 msec. [PROF]GEBackendOptimizeACL costs 976.168 msec. [PROF]GEBackendOptimizeACL costs 50.586 msec. [PROF]GEBackendOptimizeACL costs 61.784 msec. [PROF]OptimizeACLGraph costs 88.563 msec. [PROF]OptimizeACLGraph costs 163.384 msec. [PROF]GEBackendOptimizeACL costs 1.163 msec. [PROF]GEBackendOptimizeACL costs 0.371 msec. [PROF]OptimizeACLGraph costs 0.63 msec. [PROF]GEBackendOptimizeACL costs 0.744 msec. [PROF]OptimizeACLGraph costs 1.138 msec. [PROF]OptimizeACLGraph costs 4.011 msec. [PROF]GEBackendOptimizeACL costs 0.354 msec. [PROF]OptimizeACLGraph costs 0.613 msec. [PROF]GEBackendOptimizeACL costs 0.246 msec. [PROF]OptimizeACLGraph costs 0.433 msec. [PROF]GEBackendOptimizeACL costs 223.712 msec. [PROF]GEBackendOptimizeACL costs 0.592 msec. [PROF]OptimizeACLGraph costs 4.656 msec. [PROF]GEBackendOptimizeACL costs 1.112 msec. [PROF]GEBackendOptimizeACL costs 0.348 msec. [PROF]OptimizeACLGraph costs 0.635 msec. [PROF]GEBackendOptimizeACL costs 0.664 msec. [PROF]OptimizeACLGraph costs 1.041 msec. [PROF]OptimizeACLGraph costs 3.523 msec. [PROF]OptimizeACLGraph costs 383.907 msec. [PROF]OptimizeACLGraph costs 1973.31 msec. [PROF]SelectKernel costs 79.436 msec. [PROF]SelectKernel costs 142.929 msec. [PROF]SelectKernel costs 0.246 msec. [PROF]SelectKernel costs 0.7 msec. [PROF]SelectKernel costs 1.383 msec. [PROF]SelectKernel costs 0.217 msec. [PROF]SelectKernel costs 0.157 msec. [PROF]SelectKernel costs 0.288 msec. [PROF]SelectKernel costs 0.245 msec. [PROF]SelectKernel costs 0.68 msec. 
[PROF]SelectKernel costs 1.328 msec.
[PROF]SelectKernel costs 120.175 msec.
[PROF]SelectKernel costs 6527.22 msec.
[PROF]GraphKernelOptimize costs 7495.11 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 757.571 msec.
[PROF]GraphKernelOptimize costs 317.969 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 72.178 msec.
[PROF]GraphKernelOptimize costs 490.12 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 62.132 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 556.21 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 949.793 msec.
[PROF]GraphKernelOptimize costs 4.275 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 0.913 msec.
[PROF]GraphKernelOptimize costs 5.913 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 0.655 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 6.69 msec.
[PROF]GraphKernelOptimize costs 5.902 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 0.728 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 6.759 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 18.83 msec.
[PROF]GraphKernelOptimize costs 3.409 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 0.643 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 4.171 msec.
[PROF]GraphKernelOptimize costs 2.314 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 0.302 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 2.722 msec.
[PROF]GraphKernelOptimize costs 1572.14 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 155.739 msec.
[PROF]GraphKernelOptimize costs 4.19 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 0.563 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 4.897 msec.
[PROF]GraphKernelOptimize costs 4.145 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 0.86 msec.
[PROF]GraphKernelOptimize costs 5.619 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 0.688 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 6.421 msec.
[PROF]GraphKernelOptimize costs 5.893 msec.
[PROF]GEBackendOptimizeACLAfterKernelSelect costs 0.755 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 6.764 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 18.351 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 1756.76 msec.
[PROF]OptimizeACLGraphAfterKernelSelect costs 11032.7 msec.
[PROF]GEAfterInlineOptimize costs 1439.7 msec.
[PROF]InlineCallGraph costs 4034.76 msec.
[PROF]InlineSwitchGraph costs 13.567 msec.
[PROF]InlineSwitchGraph costs 25.022 msec.
[PROF]InlineSwitchGraph costs 0.099 msec.
[PROF]InlineSwitchGraph costs 0.114 msec.
[PROF]InlineSwitchGraph costs 1.827 msec.
[PROF]InlineSwitchGraph costs 0.095 msec.
[PROF]InlineSwitchGraph costs 0.062 msec.
[PROF]InlineSwitchGraph costs 0.071 msec.
[PROF]InlineSwitchGraph costs 0.071 msec.
[PROF]InlineSwitchGraph costs 0.109 msec.
[PROF]InlineSwitchGraph costs 1.373 msec.
[PROF]InlineSwitchGraph costs 447.553 msec.
[PROF]InlineSwitchGraph costs 2462.09 msec.
[PROF]OptimizeGraph costs 26180.2 msec.
[PROF]AclAfterCreateKernel costs 2543.39 msec.
[PROF]OptimizeACLGraphAfterCreateKernel costs 2543.58 msec.
[PROF]OptimizeExecutionOrder costs 1801.56 msec.
[PROF]CreateKernel costs 8482.99 msec.
The size of execution order: 44315
The size of all node: 192237
[PROF]PreprocessBeforeRun costs 2534.74 msec.
[PROF]CreateDeviceAddress costs 3620.65 msec.
[PROF]CompileSubGraph costs 54108.3 msec.
[PROF]GraphSchedulerLinkNoSinkMode costs 1774.79 msec.
[PROF]GraphSchedulerLink costs 3048.99 msec.
[PROF]GraphScheduler costs 7080.75 msec.
[PROF]compile_backend_graph costs 61495 msec.
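Before the frontend compile-time breakdown below, one note on the BroadcastDataset warning earlier: the dataset reader optimizer only takes effect in dataset sink mode. In stock MindSpore, sink mode is selected when launching training through Model.train; the snippet below is a minimal, self-contained sketch of that call (the tiny network and the random dataset are placeholders for illustration, not the GPT-2 setup from this run, and sink mode itself depends on the backend in use).

# Minimal sketch (assumed placeholder network/dataset): enabling dataset sink
# mode so passes that require it, such as the dataset reader optimizer
# mentioned in the warning above, can apply.
import numpy as np
import mindspore as ms
from mindspore import nn
from mindspore.dataset import NumpySlicesDataset
from mindspore.train import Model

ms.set_context(mode=ms.GRAPH_MODE)

net = nn.Dense(8, 2)
loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction="mean")
opt = nn.Momentum(net.trainable_params(), learning_rate=0.01, momentum=0.9)

data = {"data": np.random.randn(64, 8).astype(np.float32),
        "label": np.random.randint(0, 2, (64,)).astype(np.int32)}
ds = NumpySlicesDataset(data, shuffle=False).batch(16)

model = Model(net, loss_fn=loss_fn, optimizer=opt)
# dataset_sink_mode=True feeds batches through the device-side sink pipeline.
model.train(epoch=1, train_dataset=ds, dataset_sink_mode=True)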
TotalTime = 180.243, [21] [bootstrap]: 0.0327266 [type_inference]: 32.7738 [auto_monad]: 1.04209 [graph_reusing]: 0.749147 [inline]: 29.0602, [2] [rewriter_before_opt_a]: 0.313532 [a1a2]: 28.7466, [3] [Cycle 1]: 20.3298, [11] [expand_dump_flag]: 0.00545982 [switch_simplify]: 0.146137 [loop_unroll]: 0.126186 [a_1]: 15.7135 [recompute_prepare]: 0.122335 [updatestate_depend_eliminate]: 0.298552 [updatestate_assign_eliminate]: 0.122473 [updatestate_loads_eliminate]: 0.660671 [parameter_eliminate]: 0.00710941 [a_2]: 3.03458 [parallel_inline_pass]: 0.0921507 [Cycle 2]: 4.15912, [11] [expand_dump_flag]: 6.558e-05 [switch_simplify]: 0.0908285 [loop_unroll]: 0.0892043 [a_1]: 2.2039 [recompute_prepare]: 0.088223 [updatestate_depend_eliminate]: 0.0697963 [updatestate_assign_eliminate]: 0.0674126 [updatestate_loads_eliminate]: 0.068936 [parameter_eliminate]: 0.0001768 [a_2]: 1.39392 [parallel_inline_pass]: 0.0860947 [Cycle 3]: 4.24958, [11] [expand_dump_flag]: 6.085e-05 [switch_simplify]: 0.0853152 [loop_unroll]: 0.0848335 [a_1]: 2.15869 [recompute_prepare]: 0.0871418 [updatestate_depend_eliminate]: 0.0725325 [updatestate_assign_eliminate]: 0.0735018 [updatestate_loads_eliminate]: 0.0752026 [parameter_eliminate]: 0.00019987 [a_2]: 1.52158 [parallel_inline_pass]: 0.0899382 [parallel-infer-symbol]: 0.144976 [pre_auto_parallel]: 6.35982 [insert-virtual-dataset]: 0.0384111 [parallel-infer-symbol-second]: 9.25999e-06 [dataset_repeat_opt]: 0.0755934 [pipeline_split]: 1.17989 [optimize]: 47.0297, [52] [py_interpret_to_execute]: 0.0335979 [rewriter_before_opt_a]: 0.039175 [opt_a]: 41.1806, [4] [Cycle 1]: 26.2341, [45] [expand_dump_flag]: 1.699e-05 [switch_simplify]: 0.0176597 [loop_unroll]: 0.0159713 [a_1]: 0.514965 [recompute_prepare]: 0.0176036 [updatestate_depend_eliminate]: 0.014538 [updatestate_assign_eliminate]: 0.0125527 [updatestate_loads_eliminate]: 0.590121 [parameter_eliminate]: 4.704e-05 [a_2]: 0.177562 [accelerated_algorithm]: 0.0314243 [shard]: 3.83999e-06 [meta_shard_fg_expand]: 0.0078894 [shard_inline]: 0.0113481 [auto_parallel]: 0.00847986 [parallel]: 5.62184 [flash_sp]: 0.00458503 [merge_comm]: 0.0182071 [allreduce_fusion]: 0.00977309 [matmul_add_comm_reduction]: 0.0170183 [allreduce_slice_to_reducescatter]: 1.12e-06 [virtual_shard_identity]: 0.0130815 [virtual_dataset]: 0.0125671 [get_grad_eliminate_]: 0.0124991 [virtual_output]: 0.0124159 [merge_forward]: 0.00927623 [cell_reuse_recompute_pass]: 0.00864868 [cell_reuse_handle_not_recompute_node_pass]: 0.0246908 [before_grad]: 0.0216153 [inplace_validation]: 0.00978651 [parallel_renormalize]: 2.41523 [cast_eliminate]: 0.027248 [meta_fg_expand]: 6.28059, [1] [partial_eliminate_before_grad]: 0.0153236, [1] [Cycle 1]: 0.0153021, [1] [partial_eliminate_]: 0.0152334 [inplace_validation_after_expand]: 0.109623 [flash_sp_send_recv_attached]: 0.0930212 [receive_attached]: 0.079731 [after_resolve]: 0.11192 [a_after_grad]: 0.196202 [special_op_eliminate]: 0.111802 [renormalize]: 6.64 [add_forward_monad_depend]: 0.0553813 [auto_monad_grad]: 0.14325 [auto_monad_eliminator]: 0.153651 [cse]: 0.364435 [a_3]: 2.23293 [Cycle 2]: 8.78171, [45] [expand_dump_flag]: 0.00357432 [switch_simplify]: 0.137498 [loop_unroll]: 0.11308 [a_1]: 3.0655 [recompute_prepare]: 0.0376287 [updatestate_depend_eliminate]: 0.0360654 [updatestate_assign_eliminate]: 0.0360719 [updatestate_loads_eliminate]: 0.0361477 [parameter_eliminate]: 0.00172256 [a_2]: 1.39625 [accelerated_algorithm]: 0.0430902 [shard]: 6.48998e-06 [meta_shard_fg_expand]: 0.0178739 [shard_inline]: 0.0272857 
[auto_parallel]: 0.0234075 [parallel]: 0.164303 [flash_sp]: 1.154e-05 [merge_comm]: 0.022309 [allreduce_fusion]: 0.0216566 [matmul_add_comm_reduction]: 0.0298899 [allreduce_slice_to_reducescatter]: 1.55001e-06 [virtual_shard_identity]: 0.026798 [virtual_dataset]: 0.0263851 [get_grad_eliminate_]: 0.0257442 [virtual_output]: 0.0260109 [merge_forward]: 0.0210145 [cell_reuse_recompute_pass]: 0.0764341 [cell_reuse_handle_not_recompute_node_pass]: 0.0467244 [before_grad]: 0.0425518 [inplace_validation]: 0.0218743 [parallel_renormalize]: 2.50002e-07 [cast_eliminate]: 0.0293364 [meta_fg_expand]: 0.0504744 [inplace_validation_after_expand]: 0.0315352 [flash_sp_send_recv_attached]: 5.61998e-06 [receive_attached]: 2.37999e-06 [after_resolve]: 0.0261548 [a_after_grad]: 0.0390736 [special_op_eliminate]: 0.0262381 [renormalize]: 2.24645 [add_forward_monad_depend]: 0.00386589 [auto_monad_grad]: 0.0001029 [auto_monad_eliminator]: 0.0831276 [cse]: 0.499869 [a_3]: 0.21627 [Cycle 3]: 3.90832, [45] [expand_dump_flag]: 0.00013843 [switch_simplify]: 0.0276484 [loop_unroll]: 0.0279867 [a_1]: 0.612555 [recompute_prepare]: 0.0297004 [updatestate_depend_eliminate]: 0.0259157 [updatestate_assign_eliminate]: 0.0242678 [updatestate_loads_eliminate]: 0.0235675 [parameter_eliminate]: 0.00048776 [a_2]: 0.456471 [accelerated_algorithm]: 0.0289022 [shard]: 4.32e-06 [meta_shard_fg_expand]: 0.014198 [shard_inline]: 0.0270668 [auto_parallel]: 0.0239712 [parallel]: 1.862e-05 [flash_sp]: 4.91002e-06 [merge_comm]: 0.0233858 [allreduce_fusion]: 0.0226255 [matmul_add_comm_reduction]: 0.0345664 [allreduce_slice_to_reducescatter]: 1.54e-06 [virtual_shard_identity]: 0.0275347 [virtual_dataset]: 0.0268215 [get_grad_eliminate_]: 0.0268043 [virtual_output]: 0.0268101 [merge_forward]: 0.0226574 [cell_reuse_recompute_pass]: 0.0232633 [cell_reuse_handle_not_recompute_node_pass]: 0.0450902 [before_grad]: 0.0415703 [inplace_validation]: 0.0233307 [parallel_renormalize]: 3.19997e-07 [cast_eliminate]: 0.0291232 [meta_fg_expand]: 0.0420901 [inplace_validation_after_expand]: 0.0279768 [flash_sp_send_recv_attached]: 5.12e-06 [receive_attached]: 1.82001e-06 [after_resolve]: 0.0275264 [a_after_grad]: 0.0409425 [special_op_eliminate]: 0.0284864 [renormalize]: 1.51135 [add_forward_monad_depend]: 0.00101412 [auto_monad_grad]: 7.678e-05 [auto_monad_eliminator]: 0.0835306 [cse]: 0.24175 [a_3]: 0.205144 [Cycle 4]: 2.25636, [45] [expand_dump_flag]: 0.00012776 [switch_simplify]: 0.0283006 [loop_unroll]: 0.0276865 [a_1]: 0.597076 [recompute_prepare]: 0.0264085 [updatestate_depend_eliminate]: 0.0231687 [updatestate_assign_eliminate]: 0.0221336 [updatestate_loads_eliminate]: 0.0218008 [parameter_eliminate]: 0.00030524 [a_2]: 0.421986 [accelerated_algorithm]: 0.0267562 [shard]: 4.35e-06 [meta_shard_fg_expand]: 0.0132276 [shard_inline]: 0.0260126 [auto_parallel]: 0.0233531 [parallel]: 2.166e-05 [flash_sp]: 2.84999e-06 [merge_comm]: 0.0227757 [allreduce_fusion]: 0.0218762 [matmul_add_comm_reduction]: 0.0330309 [allreduce_slice_to_reducescatter]: 9.39996e-07 [virtual_shard_identity]: 0.0263089 [virtual_dataset]: 0.0252241 [get_grad_eliminate_]: 0.024729 [virtual_output]: 0.0249522 [merge_forward]: 0.021317 [cell_reuse_recompute_pass]: 0.0221442 [cell_reuse_handle_not_recompute_node_pass]: 0.0448437 [before_grad]: 0.0417948 [inplace_validation]: 0.0219515 [parallel_renormalize]: 2.00002e-07 [cast_eliminate]: 0.0293311 [meta_fg_expand]: 0.0346858 [inplace_validation_after_expand]: 0.0265106 [flash_sp_send_recv_attached]: 5.40999e-06 [receive_attached]: 1.89e-06 
[after_resolve]: 0.0255873 [a_after_grad]: 0.037487 [special_op_eliminate]: 0.0250637 [renormalize]: 2.00002e-07 [add_forward_monad_depend]: 0.00012757 [auto_monad_grad]: 7.539e-05 [auto_monad_eliminator]: 0.0734348 [cse]: 0.221996 [a_3]: 0.191037 [py_interpret_to_execute_after_opt_a]: 0.0379649 [slice_cell_reuse_recomputed_activation]: 0.0225626 [rewriter_after_opt_a]: 0.234032 [convert_after_rewriter]: 0.0728897 [order_py_execute_after_rewriter]: 0.0631575 [opt_b]: 2.72039, [2] [Cycle 1]: 2.17306, [7] [b_1]: 1.90711 [b_2]: 0.0155254 [updatestate_depend_eliminate]: 0.0136739 [updatestate_assign_eliminate]: 0.0132206 [updatestate_loads_eliminate]: 0.0131589 [renormalize]: 1.44998e-06 [cse]: 0.210119 [Cycle 2]: 0.547296, [7] [b_1]: 0.37913 [b_2]: 0.0151161 [updatestate_depend_eliminate]: 0.0132517 [updatestate_assign_eliminate]: 0.0129739 [updatestate_loads_eliminate]: 0.0128709 [renormalize]: 2.09984e-07 [cse]: 0.113712 [optimize_parallel_all_gather_comm]: 0.019603 [overlap_param_gather]: 6.31998e-06 [cconv]: 0.0461759 [loop_unroll]: 0.0188643 [opt_after_cconv]: 0.302496, [2] [Cycle 1]: 0.152353, [7] [c_1]: 0.0727184 [parameter_eliminate]: 0.00015651 [updatestate_depend_eliminate]: 0.013638 [updatestate_assign_eliminate]: 0.0132791 [updatestate_loads_eliminate]: 0.0133148 [cse]: 0.0390181 [renormalize]: 1.42999e-06 [Cycle 2]: 0.150114, [7] [c_1]: 0.0711387 [parameter_eliminate]: 7.015e-05 [updatestate_depend_eliminate]: 0.0133986 [updatestate_assign_eliminate]: 0.0131515 [updatestate_loads_eliminate]: 0.0134624 [cse]: 0.0387007 [renormalize]: 3.7998e-07 [remove_dup_value]: 0.0677547 [tuple_transform]: 0.985025, [1] [Cycle 1]: 0.985004, [2] [d_1]: 0.124027 [renormalize]: 0.860845 [partial_unused_args_eliminate]: 0.771413 [add_cache_embedding]: 0.00656773 [add_recomputation]: 7.43e-06 [cse_after_recomputation]: 0.0897929, [1] [Cycle 1]: 0.0897697, [1] [cse]: 0.0897011 [environ_conv]: 0.0189786 [swap_dp_allreduce_reducescatter]: 0.0186563 [bias_add_comm_swap]: 4.95999e-06 [label_micro_interleaved_index]: 3.08e-06 [label_fine_grained_interleaved_index]: 0.0171546 [merge_cast_opt]: 3.63e-06 [slice_recompute_activation]: 2.68e-06 [micro_interleaved_order_control]: 2.34001e-06 [assign_add_opt]: 0.0623483 [ForceFp32Comm]: 3.71001e-06 [remove_cast_before_assign_add]: 0.0190363 [full_micro_interleaved_order_control]: 5.05001e-06 [reorder_send_recv_between_fp_bp]: 2.08998e-06 [comm_op_add_attrs]: 0.0167973 [add_comm_op_reuse_tag]: 0.0140212 [interleave_split_concat_branches]: 2.37001e-06 [interleave_parallel_branches]: 1.07e-06 [overlap_opt_shard_in_pipeline]: 1.111e-05 [overlap_opt_shard_grad_in_pipeline]: 0.0134203 [control_data_broadcast_order]: 2.36e-06 [grouped_pairwise_exchange_alltoall]: 2.399e-05 [offloading_packed_experts]: 2.67001e-06 [overlap_recompute_and_grad_model_parallel]: 2.43e-06 [overlap_grad_matmul_and_grad_allreduce]: 9.39996e-07 [overlap_recompute_allgather_and_fa_grad]: 3.15002e-06 [overlap_grad_ring_attention]: 0.0191157 [overlap_grad_flash_sp]: 0.0201049 [begin_end_overlap_inline]: 1.94999e-06 [split_matmul_comm_elemetwise]: 5.13002e-06 [split_layernorm_comm]: 2.74001e-06 [handle_group_info]: 2.409e-05 [symbol_engine_optimizer]: 0.0964459, [1] [Cycle 1]: 0.0964256, [6] [build]: 0.00835834 [elim_shapecalc]: 0.0168597 [elim_not_effective]: 0.0264783 [opt_reshape]: 0.0178722 [fold_const_symbol]: 0.0265463 [renormalize]: 1.51998e-06 [pipeline_parallel_scheduler]: 0.0259212 [auto_monad_reorder]: 0.0574399 [get_jit_bprop_graph]: 1.10001e-06 [rewriter_after_jit_bprop_graph]: 
7.49977e-07 [eliminate_special_op_node]: 0.0555446 [distribtued_split]: 3.55e-06 [validate]: 0.0355247 [task_emit]: 61.5524 [execute]: 1.564e-05 Sums bootstrap : 0.032727s : 0.02% type_inference : 32.773794s : 18.84% auto_monad : 1.042088s : 0.60% graph_reusing : 0.749147s : 0.43% inline.rewriter_before_opt_a : 0.313532s : 0.18% inline.a1a2.expand_dump_flag : 0.005586s : 0.00% inline.a1a2.switch_simplify : 0.322281s : 0.19% inline.a1a2.loop_unroll : 0.300224s : 0.17% inline.a1a2.a_1 : 20.076118s : 11.54% inline.a1a2.recompute_prepare : 0.297700s : 0.17% inline.a1a2.updatestate_depend_eliminate : 0.440881s : 0.25% inline.a1a2.updatestate_assign_eliminate : 0.263387s : 0.15% inline.a1a2.updatestate_loads_eliminate : 0.804810s : 0.46% inline.a1a2.parameter_eliminate : 0.007486s : 0.00% inline.a1a2.a_2 : 5.950076s : 3.42% inline.a1a2.parallel_inline_pass : 0.268184s : 0.15% parallel-infer-symbol : 0.144976s : 0.08% pre_auto_parallel : 6.359822s : 3.66% insert-virtual-dataset : 0.038411s : 0.02% parallel-infer-symbol-second : 0.000009s : 0.00% dataset_repeat_opt : 0.075593s : 0.04% pipeline_split : 1.179893s : 0.68% optimize.py_interpret_to_execute : 0.033598s : 0.02% optimize.rewriter_before_opt_a : 0.039175s : 0.02% optimize.opt_a.expand_dump_flag : 0.003858s : 0.00% optimize.opt_a.switch_simplify : 0.211107s : 0.12% optimize.opt_a.loop_unroll : 0.184725s : 0.11% optimize.opt_a.a_1 : 4.790094s : 2.75% optimize.opt_a.recompute_prepare : 0.111341s : 0.06% optimize.opt_a.updatestate_depend_eliminate : 0.099688s : 0.06% optimize.opt_a.updatestate_assign_eliminate : 0.095026s : 0.05% optimize.opt_a.updatestate_loads_eliminate : 0.671637s : 0.39% optimize.opt_a.parameter_eliminate : 0.002563s : 0.00% optimize.opt_a.a_2 : 2.452269s : 1.41% optimize.opt_a.accelerated_algorithm : 0.130173s : 0.07% optimize.opt_a.shard : 0.000019s : 0.00% optimize.opt_a.meta_shard_fg_expand : 0.053189s : 0.03% optimize.opt_a.shard_inline : 0.091713s : 0.05% optimize.opt_a.auto_parallel : 0.079212s : 0.05% optimize.opt_a.parallel : 5.786186s : 3.33% optimize.opt_a.flash_sp : 0.004604s : 0.00% optimize.opt_a.merge_comm : 0.086678s : 0.05% optimize.opt_a.allreduce_fusion : 0.075931s : 0.04% optimize.opt_a.matmul_add_comm_reduction : 0.114506s : 0.07% optimize.opt_a.allreduce_slice_to_reducescatter : 0.000005s : 0.00% optimize.opt_a.virtual_shard_identity : 0.093723s : 0.05% optimize.opt_a.virtual_dataset : 0.090998s : 0.05% optimize.opt_a.get_grad_eliminate_ : 0.089777s : 0.05% optimize.opt_a.virtual_output : 0.090189s : 0.05% optimize.opt_a.merge_forward : 0.074265s : 0.04% optimize.opt_a.cell_reuse_recompute_pass : 0.130490s : 0.08% optimize.opt_a.cell_reuse_handle_not_recompute_node_pass : 0.161349s : 0.09% optimize.opt_a.before_grad : 0.147532s : 0.08% optimize.opt_a.inplace_validation : 0.076943s : 0.04% optimize.opt_a.parallel_renormalize : 2.415228s : 1.39% optimize.opt_a.cast_eliminate : 0.115039s : 0.07% optimize.opt_a.meta_fg_expand : 0.127250s : 0.07% optimize.opt_a.meta_fg_expand.partial_eliminate_before_grad.partial_eliminate_ : 0.015233s : 0.01% optimize.opt_a.inplace_validation_after_expand : 0.195646s : 0.11% optimize.opt_a.flash_sp_send_recv_attached : 0.093037s : 0.05% optimize.opt_a.receive_attached : 0.079737s : 0.05% optimize.opt_a.after_resolve : 0.191188s : 0.11% optimize.opt_a.a_after_grad : 0.313705s : 0.18% optimize.opt_a.special_op_eliminate : 0.191590s : 0.11% optimize.opt_a.renormalize : 10.397809s : 5.98% optimize.opt_a.add_forward_monad_depend : 0.060389s : 0.03% 
optimize.opt_a.auto_monad_grad : 0.143505s : 0.08% optimize.opt_a.auto_monad_eliminator : 0.393744s : 0.23% optimize.opt_a.cse : 1.328050s : 0.76% optimize.opt_a.a_3 : 2.845379s : 1.64% optimize.py_interpret_to_execute_after_opt_a : 0.037965s : 0.02% optimize.slice_cell_reuse_recomputed_activation : 0.022563s : 0.01% optimize.rewriter_after_opt_a : 0.234032s : 0.13% optimize.convert_after_rewriter : 0.072890s : 0.04% optimize.order_py_execute_after_rewriter : 0.063157s : 0.04% optimize.opt_b.b_1 : 2.286240s : 1.31% optimize.opt_b.b_2 : 0.030642s : 0.02% optimize.opt_b.updatestate_depend_eliminate : 0.026926s : 0.02% optimize.opt_b.updatestate_assign_eliminate : 0.026195s : 0.02% optimize.opt_b.updatestate_loads_eliminate : 0.026030s : 0.01% optimize.opt_b.renormalize : 0.000002s : 0.00% optimize.opt_b.cse : 0.323832s : 0.19% optimize.optimize_parallel_all_gather_comm : 0.019603s : 0.01% optimize.overlap_param_gather : 0.000006s : 0.00% optimize.cconv : 0.046176s : 0.03% optimize.loop_unroll : 0.018864s : 0.01% optimize.opt_after_cconv.c_1 : 0.143857s : 0.08% optimize.opt_after_cconv.parameter_eliminate : 0.000227s : 0.00% optimize.opt_after_cconv.updatestate_depend_eliminate : 0.027037s : 0.02% optimize.opt_after_cconv.updatestate_assign_eliminate : 0.026431s : 0.02% optimize.opt_after_cconv.updatestate_loads_eliminate : 0.026777s : 0.02% optimize.opt_after_cconv.cse : 0.077719s : 0.04% optimize.opt_after_cconv.renormalize : 0.000002s : 0.00% optimize.remove_dup_value : 0.067755s : 0.04% optimize.tuple_transform.d_1 : 0.124027s : 0.07% optimize.tuple_transform.renormalize : 0.860845s : 0.49% optimize.partial_unused_args_eliminate : 0.771413s : 0.44% optimize.add_cache_embedding : 0.006568s : 0.00% optimize.add_recomputation : 0.000007s : 0.00% optimize.cse_after_recomputation.cse : 0.089701s : 0.05% optimize.environ_conv : 0.018979s : 0.01% optimize.swap_dp_allreduce_reducescatter : 0.018656s : 0.01% optimize.bias_add_comm_swap : 0.000005s : 0.00% optimize.label_micro_interleaved_index : 0.000003s : 0.00% optimize.label_fine_grained_interleaved_index : 0.017155s : 0.01% optimize.merge_cast_opt : 0.000004s : 0.00% optimize.slice_recompute_activation : 0.000003s : 0.00% optimize.micro_interleaved_order_control : 0.000002s : 0.00% optimize.assign_add_opt : 0.062348s : 0.04% optimize.ForceFp32Comm : 0.000004s : 0.00% optimize.remove_cast_before_assign_add : 0.019036s : 0.01% optimize.full_micro_interleaved_order_control : 0.000005s : 0.00% optimize.reorder_send_recv_between_fp_bp : 0.000002s : 0.00% optimize.comm_op_add_attrs : 0.016797s : 0.01% optimize.add_comm_op_reuse_tag : 0.014021s : 0.01% optimize.interleave_split_concat_branches : 0.000002s : 0.00% optimize.interleave_parallel_branches : 0.000001s : 0.00% optimize.overlap_opt_shard_in_pipeline : 0.000011s : 0.00% optimize.overlap_opt_shard_grad_in_pipeline : 0.013420s : 0.01% optimize.control_data_broadcast_order : 0.000002s : 0.00% optimize.grouped_pairwise_exchange_alltoall : 0.000024s : 0.00% optimize.offloading_packed_experts : 0.000003s : 0.00% optimize.overlap_recompute_and_grad_model_parallel : 0.000002s : 0.00% optimize.overlap_grad_matmul_and_grad_allreduce : 0.000001s : 0.00% optimize.overlap_recompute_allgather_and_fa_grad : 0.000003s : 0.00% optimize.overlap_grad_ring_attention : 0.019116s : 0.01% optimize.overlap_grad_flash_sp : 0.020105s : 0.01% optimize.begin_end_overlap_inline : 0.000002s : 0.00% optimize.split_matmul_comm_elemetwise : 0.000005s : 0.00% optimize.split_layernorm_comm : 0.000003s : 0.00% 
optimize.handle_group_info : 0.000024s : 0.00% optimize.symbol_engine_optimizer.build : 0.008358s : 0.00% optimize.symbol_engine_optimizer.elim_shapecalc : 0.016860s : 0.01% optimize.symbol_engine_optimizer.elim_not_effective : 0.026478s : 0.02% optimize.symbol_engine_optimizer.opt_reshape : 0.017872s : 0.01% optimize.symbol_engine_optimizer.fold_const_symbol : 0.026546s : 0.02% optimize.symbol_engine_optimizer.renormalize : 0.000002s : 0.00% pipeline_parallel_scheduler : 0.025921s : 0.01% auto_monad_reorder : 0.057440s : 0.03% get_jit_bprop_graph : 0.000001s : 0.00% rewriter_after_jit_bprop_graph : 0.000001s : 0.00% eliminate_special_op_node : 0.055545s : 0.03% distribtued_split : 0.000004s : 0.00% validate : 0.035525s : 0.02% task_emit : 61.552356s : 35.39% execute : 0.000016s : 0.00% Time group info: ------[substitution.] 6.919706924801 0.03% : 0.002316s : 775: substitution.addn_check_dump 0.39% : 0.026889s : 3666: substitution.addn_zero_filter 0.08% : 0.005636s : 3562: substitution.adjust_all_reduce_mul_add 4.66% : 0.322379s : 96310: substitution.arithmetic_simplify 1.88% : 0.129896s : 33555: substitution.cast_eliminate 0.01% : 0.000352s : 45: substitution.compare_switch_simplify 0.57% : 0.039471s : 306: substitution.const_output_eliminate 0.46% : 0.031927s : 19098: substitution.depend_value_elim 0.04% : 0.002713s : 7063: substitution.elim_not_effective 0.00% : 0.000040s : 91: substitution.elim_shapecalc_of_broadcastargs 0.05% : 0.003686s : 2442: substitution.environ_get_add_eliminate 0.05% : 0.003139s : 2351: substitution.environ_get_depend_swap 0.11% : 0.007388s : 5425: substitution.environ_get_eliminate 0.10% : 0.007019s : 2442: substitution.environ_get_set_eliminate 0.00% : 0.000224s : 59: substitution.exchange_switch_depend_value 0.08% : 0.005769s : 4431: substitution.float_depend_g_call 0.03% : 0.002303s : 2983: substitution.float_environ_get_switch 0.27% : 0.018946s : 21681: substitution.float_tuple_getitem_switch 0.04% : 0.002757s : 7063: substitution.fold_const_symbol 9.28% : 0.641843s : 4796: substitution.getattr_setattr_resolve 0.53% : 0.036486s : 8622: substitution.graph_param_transform 0.01% : 0.000667s : 587: substitution.incorporate_call 0.01% : 0.000549s : 587: substitution.incorporate_call_switch 61.71% : 4.270389s : 73266: substitution.inline 0.41% : 0.028603s : 2711: substitution.inline_without_move 0.47% : 0.032530s : 42016: substitution.j_node_and_user_rematch 0.47% : 0.032378s : 5084: substitution.less_batch_normalization 0.55% : 0.037821s : 50955: substitution.load_eliminater 4.07% : 0.281530s : 3651: substitution.merge_addn 0.18% : 0.012329s : 91: substitution.micro_step_allgather_replace 0.29% : 0.019908s : 16427: substitution.minmaximum_grad 0.00% : 0.000184s : 313: substitution.opt_reshape 0.01% : 0.000379s : 162: substitution.parallel_virtual_node 0.00% : 0.000018s : 11: substitution.partial_defer_inline 0.35% : 0.024347s : 4537: substitution.partial_eliminate 0.00% : 0.000208s : 177: substitution.reduce_all_const_elim 0.06% : 0.004238s : 3273: substitution.reduce_eliminate 0.75% : 0.051783s : 42016: substitution.remove_not_recompute_node 1.70% : 0.117517s : 33425: substitution.replace_applicator 0.07% : 0.004699s : 7972: substitution.replace_old_param 0.02% : 0.001297s : 508: substitution.reset_defer_inline 0.48% : 0.033293s : 6821: substitution.reshape_eliminate 0.03% : 0.001751s : 1073: substitution.set_cell_output_no_recompute 0.04% : 0.002534s : 1065: substitution.specialize_transform 0.01% : 0.000896s : 546: 
substitution.split_environ_get_set_with_tuple_value 0.01% : 0.000445s : 79: substitution.switch_call_monad_eliminater 0.00% : 0.000286s : 59: substitution.switch_defer_inline 0.06% : 0.004255s : 1279: substitution.switch_simplify 0.08% : 0.005853s : 1750: substitution.transpose_eliminate 0.84% : 0.058171s : 22084: substitution.tuple_list_convert_item_index_to_positive 0.50% : 0.034254s : 25465: substitution.tuple_list_get_item_const_eliminator 0.87% : 0.060226s : 25165: substitution.tuple_list_get_item_depend_reorder 1.95% : 0.134918s : 55664: substitution.tuple_list_get_item_eliminator 0.62% : 0.043081s : 25283: substitution.tuple_list_get_set_item_eliminator 0.01% : 0.000924s : 115: substitution.tuple_list_set_item_eliminator 1.76% : 0.121887s : 114440: substitution.updatestate_pure_node_eliminater 2.95% : 0.204119s : 129335: substitution.updatestate_useless_node_eliminater 0.00% : 0.000230s : 42: substitution.value_based_eliminate 0.00% : 0.000030s : 1: substitution.virtual_dataset_eliminate ------[type_inference.] 32.693631 2 90.12% : 29.463839s : 1: type_inference.infer 9.88% : 3.229792s : 1: type_inference.specialize ------[replace.] 2.219051107424 0.09% : 0.002098s : 104: replace.addn_zero_filter 0.00% : 0.000031s : 3: replace.arithmetic_simplify 2.46% : 0.054625s : 6513: replace.cast_eliminate 0.19% : 0.004261s : 417: replace.depend_value_elim 1.89% : 0.041865s : 91: replace.environ_get_set_eliminate 3.85% : 0.085323s : 4173: replace.getattr_setattr_resolve 0.29% : 0.006378s : 2: replace.graph_param_transform 60.64% : 1.345592s : 62640: replace.inline 5.47% : 0.121424s : 2876: replace.merge_addn 0.07% : 0.001529s : 91: replace.micro_step_allgather_replace 0.26% : 0.005668s : 162: replace.parallel_virtual_node 3.53% : 0.078348s : 2924: replace.partial_eliminate 7.49% : 0.166220s : 6965: replace.replace_applicator 0.66% : 0.014711s : 1047: replace.reshape_eliminate 0.03% : 0.000695s : 5: replace.switch_call_monad_eliminater 1.04% : 0.023028s : 1140: replace.switch_simplify 0.09% : 0.002107s : 182: replace.tuple_list_get_item_const_eliminator 1.59% : 0.035326s : 3138: replace.tuple_list_get_item_depend_reorder 9.62% : 0.213488s : 13714: replace.tuple_list_get_item_eliminator 0.16% : 0.003440s : 118: replace.tuple_list_get_set_item_eliminator 0.10% : 0.002177s : 58: replace.tuple_list_set_item_eliminator 0.45% : 0.009876s : 1058: replace.updatestate_pure_node_eliminater 0.03% : 0.000769s : 1: replace.updatestate_useless_node_eliminater 0.00% : 0.000044s : 1: replace.value_based_eliminate 0.00% : 0.000028s : 1: replace.virtual_dataset_eliminate ------[match.] 
5.252583107424 0.01% : 0.000467s : 104: match.addn_zero_filter 0.00% : 0.000059s : 3: match.arithmetic_simplify 0.52% : 0.027522s : 6513: match.cast_eliminate 0.01% : 0.000309s : 417: match.depend_value_elim 0.08% : 0.004070s : 91: match.environ_get_set_eliminate 11.17% : 0.586819s : 4173: match.getattr_setattr_resolve 0.44% : 0.023310s : 2: match.graph_param_transform 79.92% : 4.197885s : 62640: match.inline 5.23% : 0.274676s : 2876: match.merge_addn 0.23% : 0.012243s : 91: match.micro_step_allgather_replace 0.01% : 0.000274s : 162: match.parallel_virtual_node 0.39% : 0.020689s : 2924: match.partial_eliminate 0.65% : 0.034262s : 6965: match.replace_applicator 0.11% : 0.005579s : 1047: match.reshape_eliminate 0.01% : 0.000366s : 5: match.switch_call_monad_eliminater 0.05% : 0.002824s : 1140: match.switch_simplify 0.03% : 0.001472s : 182: match.tuple_list_get_item_const_eliminator 0.34% : 0.017802s : 3138: match.tuple_list_get_item_depend_reorder 0.77% : 0.040404s : 13714: match.tuple_list_get_item_eliminator 0.01% : 0.000531s : 118: match.tuple_list_get_set_item_eliminator 0.01% : 0.000492s : 58: match.tuple_list_set_item_eliminator 0.01% : 0.000489s : 1058: match.updatestate_pure_node_eliminater 0.00% : 0.000010s : 1: match.updatestate_useless_node_eliminater 0.00% : 0.000003s : 1: match.value_based_eliminate 0.00% : 0.000027s : 1: match.virtual_dataset_eliminate ------[predicate.] 6.91733628384199 0.97% : 0.066992s : 379437: predicate.accumulaten_eliminater 0.07% : 0.005054s : 8802: predicate.ad_related_special_op_eliminate 1.43% : 0.099194s : 270894: predicate.addn_check_dump 0.98% : 0.067493s : 379541: predicate.addn_zero_filter 0.96% : 0.066090s : 379437: predicate.adjust_all_reduce_mul_add 2.89% : 0.200106s : 650438: predicate.arithmetic_simplify 1.29% : 0.089006s : 435495: predicate.cast_eliminate 1.15% : 0.079214s : 203897: predicate.check_bprop_eliminate 1.44% : 0.099765s : 270894: predicate.compare_switch_simplify 0.06% : 0.004069s : 45078: predicate.const_output_eliminate 0.07% : 0.004975s : 8802: predicate.convert_tensor_all_eliminate 1.17% : 0.080700s : 404072: predicate.convert_tensor_eliminate 1.43% : 0.099007s : 271244: predicate.depend_value_elim 0.95% : 0.065465s : 387195: predicate.dict_get_item_const_eliminator 0.94% : 0.065172s : 387195: predicate.dict_get_item_eliminator 0.92% : 0.063918s : 387195: predicate.dict_set_item_eliminator 0.01% : 0.000740s : 8508: predicate.elim_not_effective 0.06% : 0.004423s : 8508: predicate.elim_shapecalc_of_broadcastargs 0.00% : 0.000307s : 2050: predicate.eliminate_switch_layer_partial_ 0.00% : 0.000310s : 2050: predicate.eliminate_switch_partial_ 1.16% : 0.080352s : 433944: predicate.environ_add_const_eliminate 1.16% : 0.080278s : 434035: predicate.environ_get_add_eliminate 1.16% : 0.080398s : 433944: predicate.environ_get_depend_swap 2.65% : 0.183544s : 704929: predicate.environ_get_eliminate 1.15% : 0.079388s : 434035: predicate.environ_get_set_eliminate 1.17% : 0.080707s : 470466: predicate.exchange_switch_depend_value 1.48% : 0.102513s : 470466: predicate.float_depend_g_call 1.45% : 0.100135s : 270894: predicate.float_environ_get_switch 1.69% : 0.117129s : 318633: predicate.float_tuple_getitem_switch 0.01% : 0.000734s : 8508: predicate.fold_const_symbol 0.28% : 0.019695s : 48440: predicate.get_grad_eliminate 0.21% : 0.014630s : 42397: predicate.getattr_setattr_resolve 0.01% : 0.000844s : 8622: predicate.graph_param_transform 1.41% : 0.097878s : 270894: predicate.incorporate_call 1.42% : 0.098199s : 270894: 
predicate.incorporate_call_switch 6.15% : 0.425675s : 1390592: predicate.inline 0.74% : 0.050880s : 101691: predicate.inline_without_move 0.06% : 0.004280s : 48480: predicate.j_node_and_user_rematch 0.31% : 0.021547s : 50259: predicate.less_batch_normalization 1.27% : 0.087643s : 460906: predicate.list_to_tuple_eliminator_ 2.24% : 0.154845s : 851496: predicate.load_eliminater 0.05% : 0.003615s : 9121: predicate.loop_unroll_after_grad 1.65% : 0.113852s : 288986: predicate.loop_unroll_before_grad 1.23% : 0.085264s : 456675: predicate.make_slice_get_slice_eliminator 1.50% : 0.103988s : 276475: predicate.merge_addn 1.02% : 0.070364s : 171733: predicate.micro_step_allgather_replace 1.04% : 0.072166s : 171642: predicate.mini_step_allgather_replace 0.90% : 0.062224s : 379544: predicate.minmaximum_grad 0.07% : 0.004678s : 8802: predicate.mutable_eliminate 0.07% : 0.004868s : 8508: predicate.opt_reshape 0.28% : 0.019484s : 47059: predicate.parallel_virtual_node 2.89% : 0.199850s : 470466: predicate.partial_defer_inline 1.18% : 0.081753s : 427296: predicate.partial_eliminate 0.94% : 0.065122s : 379437: predicate.print_const_string_wrapper 1.43% : 0.098576s : 270365: predicate.reduce_all_const_elim 1.11% : 0.076532s : 379544: predicate.reduce_eliminate 0.06% : 0.004129s : 48480: predicate.remove_not_recompute_node 0.90% : 0.062310s : 594949: predicate.replace_applicator 0.13% : 0.009033s : 101691: predicate.replace_old_param 0.06% : 0.004364s : 47739: predicate.reset_defer_inline 0.97% : 0.067318s : 380591: predicate.reshape_eliminate 1.00% : 0.069076s : 171642: predicate.row_tensor_add_zeros_like 0.10% : 0.006883s : 17014: predicate.row_tensor_eliminate 1.16% : 0.080051s : 203897: predicate.same_eliminate 0.32% : 0.021849s : 236185: predicate.set_cell_output_no_recompute 0.28% : 0.019645s : 48440: predicate.shard_identity_eliminate 0.93% : 0.064294s : 148531: predicate.special_op_eliminate 1.67% : 0.115648s : 276646: predicate.specialize_transform 1.14% : 0.079102s : 171551: predicate.split_environ_get_set_with_tuple_value 0.29% : 0.019744s : 101691: predicate.stack_unstack_eliminate 2.19% : 0.151462s : 851491: predicate.stopgrad_eliminater 0.05% : 0.003180s : 18250: predicate.switch_call_monad_eliminater 1.26% : 0.086951s : 470466: predicate.switch_defer_inline 2.35% : 0.162785s : 674363: predicate.switch_layer_defer_inline 4.37% : 0.302416s : 1038378: predicate.switch_simplify 0.95% : 0.066024s : 379544: predicate.tile_eliminate 0.94% : 0.065256s : 379544: predicate.transpose_eliminate 1.33% : 0.092023s : 443494: predicate.tuple_list_convert_item_index_to_positive 1.33% : 0.092061s : 446990: predicate.tuple_list_get_item_const_eliminator 1.27% : 0.087728s : 446632: predicate.tuple_list_get_item_depend_reorder 2.95% : 0.204207s : 731792: predicate.tuple_list_get_item_eliminator 1.34% : 0.092701s : 446750: predicate.tuple_list_get_set_item_eliminator 2.87% : 0.198199s : 717702: predicate.tuple_list_set_item_eliminator 1.30% : 0.090231s : 460906: predicate.tuple_to_list_eliminator_ 2.26% : 0.156571s : 852554: predicate.updatestate_pure_node_eliminater 3.74% : 0.258749s : 1123449: predicate.updatestate_useless_node_eliminater 0.28% : 0.019124s : 46841: predicate.value_based_eliminate 0.29% : 0.019919s : 48442: predicate.virtual_dataset_eliminate 0.28% : 0.019489s : 48440: predicate.virtual_output_eliminate 0.28% : 0.019186s : 48214: predicate.zero_like_fill_zero ------[func_graph_cloner_run.] 
6.411449245154 42.41% : 2.719331s : 19635: func_graph_cloner_run.FuncGraphClonerGraph 34.86% : 2.234894s : 202894: func_graph_cloner_run.FuncGraphClonerNode 22.73% : 1.457224s : 22625: func_graph_cloner_run.FuncGraphSpecializer ------[meta_graph.] 0.000000 0 ------[manager.] 0.000000 0 ------[pynative] 0.000000 0 ------[others.] 313.857611 907 0.00% : 0.000011s : 1: ForceFp32Comm 9.16% : 28.746637s : 1: a1a2 0.00% : 0.006604s : 1: add_cache_embedding 0.00% : 0.014068s : 1: add_comm_op_reuse_tag 0.00% : 0.000014s : 1: add_recomputation 0.02% : 0.062394s : 1: assign_add_opt 0.33% : 1.042151s : 1: auto_monad 0.02% : 0.057500s : 1: auto_monad_reorder 0.00% : 0.000014s : 1: begin_end_overlap_inline 0.00% : 0.000012s : 1: bias_add_comm_swap 0.01% : 0.032769s : 1: bootstrap 0.01% : 0.046213s : 1: cconv 0.01% : 0.016842s : 1: comm_op_add_attrs 0.00% : 0.000010s : 1: control_data_broadcast_order 0.02% : 0.072930s : 1: convert_after_rewriter 0.03% : 0.089802s : 1: cse_after_recomputation 0.02% : 0.075629s : 1: dataset_repeat_opt 0.00% : 0.000014s : 1: distribtued_split 0.02% : 0.055585s : 1: eliminate_special_op_node 0.01% : 0.019024s : 1: environ_conv 0.00% : 0.000025s : 1: execute 0.00% : 0.000013s : 1: full_micro_interleaved_order_control 0.00% : 0.000012s : 1: get_jit_bprop_graph 0.24% : 0.749206s : 1: graph_reusing 0.00% : 0.000094s : 1: grouped_pairwise_exchange_alltoall 0.00% : 0.000029s : 1: handle_group_info 9.26% : 29.060261s : 1: inline 0.01% : 0.038483s : 1: insert-virtual-dataset 0.00% : 0.000005s : 1: interleave_parallel_branches 0.00% : 0.000011s : 1: interleave_split_concat_branches 0.01% : 0.017195s : 1: label_fine_grained_interleaved_index 0.00% : 0.000007s : 1: label_micro_interleaved_index 0.01% : 0.018886s : 1: loop_unroll 0.00% : 0.000012s : 1: merge_cast_opt 0.00% : 0.000007s : 1: micro_interleaved_order_control 0.00% : 0.000008s : 1: offloading_packed_experts 8.67% : 27.213705s : 83: opt.transform.a1a2 0.01% : 0.016664s : 1: opt.transform.loop_unroll_optimizer 3.92% : 12.300017s : 189: opt.transform.opt_a 0.05% : 0.143843s : 2: opt.transform.opt_after_cconv 0.74% : 2.316175s : 132: opt.transform.opt_b 0.26% : 0.810727s : 404: opt.transform.opt_resolve 0.04% : 0.124010s : 1: opt.transform.opt_trans_graph 0.00% : 0.015219s : 1: opt.transform.partial_eliminate 0.02% : 0.054546s : 3: opt.transform.special_op_eliminate 0.03% : 0.087718s : 4: opt.transform.symbol_engine_opt 13.12% : 41.180589s : 1: opt_a 0.10% : 0.302506s : 1: opt_after_cconv 0.87% : 2.720402s : 1: opt_b 14.98% : 47.029697s : 1: optimize 0.01% : 0.019638s : 1: optimize_parallel_all_gather_comm 0.02% : 0.063193s : 1: order_py_execute_after_rewriter 0.01% : 0.020154s : 1: overlap_grad_flash_sp 0.00% : 0.000005s : 1: overlap_grad_matmul_and_grad_allreduce 0.01% : 0.019154s : 1: overlap_grad_ring_attention 0.00% : 0.013460s : 1: overlap_opt_shard_grad_in_pipeline 0.00% : 0.000016s : 1: overlap_opt_shard_in_pipeline 0.00% : 0.000013s : 1: overlap_param_gather 0.00% : 0.000007s : 1: overlap_recompute_allgather_and_fa_grad 0.00% : 0.000005s : 1: overlap_recompute_and_grad_model_parallel 0.05% : 0.145039s : 1: parallel-infer-symbol 0.00% : 0.000019s : 1: parallel-infer-symbol-second 0.25% : 0.771446s : 1: partial_unused_args_eliminate 0.01% : 0.025975s : 1: pipeline_parallel_scheduler 0.38% : 1.179948s : 1: pipeline_split 2.03% : 6.359881s : 1: pre_auto_parallel 0.01% : 0.033636s : 1: py_interpret_to_execute 0.01% : 0.038010s : 1: py_interpret_to_execute_after_opt_a 0.01% : 0.019076s : 1: remove_cast_before_assign_add 
0.02% : 0.067801s : 1: remove_dup_value 2.69% : 8.453095s : 6: renormalize.infer 1.91% : 5.989192s : 6: renormalize.specialize 0.00% : 0.000006s : 1: reorder_send_recv_between_fp_bp 0.00% : 0.000008s : 1: rewriter_after_jit_bprop_graph 0.07% : 0.234069s : 1: rewriter_after_opt_a 0.11% : 0.352784s : 2: rewriter_before_opt_a 0.01% : 0.022601s : 1: slice_cell_reuse_recomputed_activation 0.00% : 0.000006s : 1: slice_recompute_activation 0.00% : 0.000006s : 1: split_layernorm_comm 0.00% : 0.000009s : 1: split_matmul_comm_elemetwise 0.01% : 0.018698s : 1: swap_dp_allreduce_reducescatter 0.03% : 0.096455s : 1: symbol_engine_optimizer 19.61% : 61.552415s : 1: task_emit 0.31% : 0.985037s : 1: tuple_transform 10.44% : 32.773861s : 1: type_inference 0.02% : 0.064599s : 1: validate
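The Sums block above lists every frontend pass as `name : <seconds>s : <percent>%`, which is easier to digest once ranked. The helper below is a generic post-processing script, not part of MindSpore; it extracts those entries from a saved copy of this log (the file path is supplied on the command line) and prints the slowest phases, e.g. task_emit (~35.4%) and type_inference (~18.8%) in this run.

# Generic helper (not part of MindSpore): rank the "name : Xs : Y%" entries
# from the compile-statistics "Sums" block to find the dominant phases.
import re
import sys

PATTERN = re.compile(r"([\w.\-]+)\s*:\s*([\d.]+)s\s*:\s*([\d.]+)%")

def top_phases(text, n=10):
    # Collect (name, seconds, percent) tuples and sort by seconds, descending.
    entries = [(name, float(sec), float(pct))
               for name, sec, pct in PATTERN.findall(text)]
    return sorted(entries, key=lambda e: e[1], reverse=True)[:n]

if __name__ == "__main__":
    with open(sys.argv[1], encoding="utf-8") as f:  # path to the saved log
        log_text = f.read()
    for name, sec, pct in top_phases(log_text):
        print(f"{name:55s} {sec:10.3f}s {pct:6.2f}%")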